[ 607.993808] env[62227]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 608.611920] env[62277]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 609.943663] env[62277]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' {{(pid=62277) initialize /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:44}}
[ 609.944057] env[62277]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' {{(pid=62277) initialize /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:44}}
[ 609.944208] env[62277]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' {{(pid=62277) initialize /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:44}}
[ 609.944553] env[62277]: INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
[ 610.148444] env[62277]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm {{(pid=62277) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}}
[ 610.158721] env[62277]: DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.010s {{(pid=62277) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}}
[ 610.262043] env[62277]: INFO nova.virt.driver [None req-ea3a2f45-036a-42fa-88ae-34939f2e789d None None] Loading compute driver 'vmwareapi.VMwareVCDriver'
[ 610.339695] env[62277]: DEBUG oslo_concurrency.lockutils [-] Acquiring lock "oslo_vmware_api_lock" by "oslo_vmware.api.VMwareAPISession._create_session" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 610.339875] env[62277]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" acquired by "oslo_vmware.api.VMwareAPISession._create_session" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 610.339979] env[62277]: DEBUG oslo_vmware.service [-] Creating suds client with soap_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk' and wsdl_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk/vimService.wsdl' {{(pid=62277) __init__ /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:242}}
[ 613.176444] env[62277]: DEBUG oslo_vmware.service [-] Invoking ServiceInstance.RetrieveServiceContent with opID=oslo.vmware-e9891750-c0b3-497e-8f37-cc57b34e15fe {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 613.192783] env[62277]: DEBUG oslo_vmware.api [-] Logging into host: vc1.osci.c.eu-de-1.cloud.sap. {{(pid=62277) _create_session /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:242}}
[ 613.192960] env[62277]: DEBUG oslo_vmware.service [-] Invoking SessionManager.Login with opID=oslo.vmware-1ec77b16-bb9e-42d6-a647-69d3b240103b {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 613.224141] env[62277]: INFO oslo_vmware.api [-] Successfully established new session; session ID is f7653.
[ 613.224314] env[62277]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" "released" by "oslo_vmware.api.VMwareAPISession._create_session" :: held 2.884s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 613.224842] env[62277]: INFO nova.virt.vmwareapi.driver [None req-ea3a2f45-036a-42fa-88ae-34939f2e789d None None] VMware vCenter version: 7.0.3
[ 613.228371] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f0433494-94c9-4376-9c83-1c1bb7b55042 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 613.249649] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c286dc6-6113-4806-8cfd-2f5223d13a0c {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 613.255855] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c39d642-35fe-4392-810e-47486d322631 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 613.262414] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-021a6681-7135-42e3-ad21-22fa253ae015 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 613.275323] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2ab7625e-8cda-4685-a662-28811f41a02a {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 613.281110] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dad9a8d3-9baf-4932-82e2-bfac3a43d7aa {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 613.311468] env[62277]: DEBUG oslo_vmware.service [-] Invoking ExtensionManager.FindExtension with opID=oslo.vmware-744509d5-5932-4bd8-9f41-44c1fa4dc73e {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 613.316537] env[62277]: DEBUG nova.virt.vmwareapi.driver [None req-ea3a2f45-036a-42fa-88ae-34939f2e789d None None] Extension org.openstack.compute already exists. {{(pid=62277) _register_openstack_extension /opt/stack/nova/nova/virt/vmwareapi/driver.py:224}}
[ 613.319183] env[62277]: INFO nova.compute.provider_config [None req-ea3a2f45-036a-42fa-88ae-34939f2e789d None None] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
[ 613.337693] env[62277]: DEBUG nova.context [None req-ea3a2f45-036a-42fa-88ae-34939f2e789d None None] Found 2 cells: 00000000-0000-0000-0000-000000000000(cell0),2dd635c4-71a4-4a6b-a010-d038221d39c7(cell1) {{(pid=62277) load_cells /opt/stack/nova/nova/context.py:464}}
[ 613.339582] env[62277]: DEBUG oslo_concurrency.lockutils [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 613.339806] env[62277]: DEBUG oslo_concurrency.lockutils [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 613.340529] env[62277]: DEBUG oslo_concurrency.lockutils [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 613.341102] env[62277]: DEBUG oslo_concurrency.lockutils [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] Acquiring lock "2dd635c4-71a4-4a6b-a010-d038221d39c7" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 613.341299] env[62277]: DEBUG oslo_concurrency.lockutils [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] Lock "2dd635c4-71a4-4a6b-a010-d038221d39c7" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 613.342262] env[62277]: DEBUG oslo_concurrency.lockutils [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] Lock "2dd635c4-71a4-4a6b-a010-d038221d39c7" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 613.366844] env[62277]: INFO dbcounter [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] Registered counter for database nova_cell0
[ 613.375309] env[62277]: INFO dbcounter [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] Registered counter for database nova_cell1
[ 613.378388] env[62277]: DEBUG oslo_db.sqlalchemy.engines [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=62277) _check_effective_sql_mode /opt/stack/data/venv/lib/python3.10/site-packages/oslo_db/sqlalchemy/engines.py:342}}
[ 613.378991] env[62277]: DEBUG oslo_db.sqlalchemy.engines [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=62277) _check_effective_sql_mode /opt/stack/data/venv/lib/python3.10/site-packages/oslo_db/sqlalchemy/engines.py:342}}
[ 613.383348] env[62277]: DEBUG dbcounter [-] [62277] Writer thread running {{(pid=62277) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:102}}
[ 613.384477] env[62277]: DEBUG dbcounter [-] [62277] Writer thread running {{(pid=62277) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:102}}
[ 613.386301] env[62277]: ERROR nova.db.main.api [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] No DB access allowed in nova-compute: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 613.386301] env[62277]: result = function(*args, **kwargs)
[ 613.386301] env[62277]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 613.386301] env[62277]: return func(*args, **kwargs)
[ 613.386301] env[62277]: File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 613.386301] env[62277]: result = fn(*args, **kwargs)
[ 613.386301] env[62277]: File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 613.386301] env[62277]: return f(*args, **kwargs)
[ 613.386301] env[62277]: File "/opt/stack/nova/nova/objects/service.py", line 548, in _db_service_get_minimum_version
[ 613.386301] env[62277]: return db.service_get_minimum_version(context, binaries)
[ 613.386301] env[62277]: File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 613.386301] env[62277]: _check_db_access()
[ 613.386301] env[62277]: File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 613.386301] env[62277]: stacktrace = ''.join(traceback.format_stack())
[ 613.386301] env[62277]:
[ 613.387360] env[62277]: ERROR nova.db.main.api [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] No DB access allowed in nova-compute: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 613.387360] env[62277]: result = function(*args, **kwargs)
[ 613.387360] env[62277]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 613.387360] env[62277]: return func(*args, **kwargs)
[ 613.387360] env[62277]: File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 613.387360] env[62277]: result = fn(*args, **kwargs)
[ 613.387360] env[62277]: File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 613.387360] env[62277]: return f(*args, **kwargs)
[ 613.387360] env[62277]: File "/opt/stack/nova/nova/objects/service.py", line 548, in _db_service_get_minimum_version
[ 613.387360] env[62277]: return db.service_get_minimum_version(context, binaries)
[ 613.387360] env[62277]: File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 613.387360] env[62277]: _check_db_access()
[ 613.387360] env[62277]: File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 613.387360] env[62277]: stacktrace = ''.join(traceback.format_stack())
[ 613.387360] env[62277]:
[ 613.387971] env[62277]: WARNING nova.objects.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] Failed to get minimum service version for cell 2dd635c4-71a4-4a6b-a010-d038221d39c7
[ 613.387971] env[62277]: WARNING nova.objects.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] Failed to get minimum service version for cell 00000000-0000-0000-0000-000000000000
[ 613.388357] env[62277]: DEBUG oslo_concurrency.lockutils [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] Acquiring lock "singleton_lock" {{(pid=62277) lock
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 613.388516] env[62277]: DEBUG oslo_concurrency.lockutils [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] Acquired lock "singleton_lock" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 613.388769] env[62277]: DEBUG oslo_concurrency.lockutils [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] Releasing lock "singleton_lock" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 613.389095] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] Full set of CONF: {{(pid=62277) _wait_for_exit_or_signal /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/service.py:362}} [ 613.389240] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] ******************************************************************************** {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2589}} [ 613.389366] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] Configuration options gathered from: {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2590}} [ 613.389499] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] command line args: ['--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-cpu-common.conf', '--config-file', '/etc/nova/nova-cpu-1.conf'] {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2591}} [ 613.389682] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2592}} [ 613.389811] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] ================================================================================ {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2594}} [ 613.390098] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] allow_resize_to_same_host = True {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.390284] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] arq_binding_timeout = 300 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.390415] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] backdoor_port = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.390541] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] backdoor_socket = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.390706] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] block_device_allocate_retries = 60 {{(pid=62277) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.390877] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] block_device_allocate_retries_interval = 3 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.391062] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cert = self.pem {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.391234] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] compute_driver = vmwareapi.VMwareVCDriver {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.391402] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] compute_monitors = [] {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.391571] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] config_dir = [] {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.391738] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] config_drive_format = iso9660 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.391873] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.392046] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] config_source = [] {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.392219] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] console_host = devstack {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.392382] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] control_exchange = nova {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.392540] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cpu_allocation_ratio = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.392696] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] daemon = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.392860] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] debug = True {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.393023] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] default_access_ip_network_name = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.393195] 
env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] default_availability_zone = nova {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.393347] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] default_ephemeral_format = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.393502] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] default_green_pool_size = 1000 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.393742] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.393909] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] default_schedule_zone = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.394081] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] disk_allocation_ratio = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.394246] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] enable_new_services = True {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.394422] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] enabled_apis = ['osapi_compute'] {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.394584] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] enabled_ssl_apis = [] {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.394743] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] flat_injected = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.394902] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] force_config_drive = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.395079] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] force_raw_images = True {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.395245] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 
None None] graceful_shutdown_timeout = 5 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.395409] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] heal_instance_info_cache_interval = 60 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.395621] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] host = cpu-1 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.395795] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] initial_cpu_allocation_ratio = 4.0 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.395957] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] initial_disk_allocation_ratio = 1.0 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.396134] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] initial_ram_allocation_ratio = 1.0 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.396355] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] injected_network_template = /opt/stack/nova/nova/virt/interfaces.template {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.396519] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] instance_build_timeout = 0 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.396677] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] instance_delete_interval = 300 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.396844] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] instance_format = [instance: %(uuid)s] {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.397015] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] instance_name_template = instance-%08x {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.397183] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] instance_usage_audit = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.397351] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] instance_usage_audit_period = month {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.397517] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] instance_uuid_format = [instance: %(uuid)s] {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.397717] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] 
instances_path = /opt/stack/data/nova/instances {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.397884] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] internal_service_availability_zone = internal {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.398056] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] key = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.398224] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] live_migration_retry_count = 30 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.398388] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] log_config_append = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.398558] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] log_date_format = %Y-%m-%d %H:%M:%S {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.398726] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] log_dir = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.398885] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] log_file = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.399017] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] log_options = True {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.399186] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] log_rotate_interval = 1 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.399353] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] log_rotate_interval_type = days {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.399518] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] log_rotation_type = none {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.399647] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] logging_context_format_string = %(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.399775] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] logging_debug_format_suffix = {{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}} {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.399945] env[62277]: DEBUG 
oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] logging_default_format_string = %(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.400119] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] logging_exception_prefix = ERROR %(name)s %(instance)s {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.400249] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.400413] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] long_rpc_timeout = 1800 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.400571] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] max_concurrent_builds = 10 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.400727] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] max_concurrent_live_migrations = 1 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.400883] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] max_concurrent_snapshots = 5 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.401048] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] max_local_block_devices = 3 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.401211] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] max_logfile_count = 30 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.401367] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] max_logfile_size_mb = 200 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.401526] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] maximum_instance_delete_attempts = 5 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.401690] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] metadata_listen = 0.0.0.0 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.401857] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] metadata_listen_port = 8775 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.402034] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] metadata_workers = 2 {{(pid=62277) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.402199] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] migrate_max_retries = -1 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.402367] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] mkisofs_cmd = genisoimage {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.402571] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] my_block_storage_ip = 10.180.1.21 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.402702] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] my_ip = 10.180.1.21 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.402865] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] network_allocate_retries = 0 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.403052] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.403226] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] osapi_compute_listen = 0.0.0.0 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.403386] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] osapi_compute_listen_port = 8774 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.403552] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] osapi_compute_unique_server_name_scope = {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.403719] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] osapi_compute_workers = 2 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.403880] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] password_length = 12 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.404053] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] periodic_enable = True {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.404217] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] periodic_fuzzy_delay = 60 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.404384] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] pointer_model = usbtablet {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.404549] env[62277]: 
DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] preallocate_images = none {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.404707] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] publish_errors = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.404836] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] pybasedir = /opt/stack/nova {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.404993] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] ram_allocation_ratio = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.405166] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] rate_limit_burst = 0 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.405330] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] rate_limit_except_level = CRITICAL {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.405489] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] rate_limit_interval = 0 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.405643] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] reboot_timeout = 0 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.405801] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] reclaim_instance_interval = 0 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.405954] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] record = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.406132] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] reimage_timeout_per_gb = 60 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.406297] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] report_interval = 120 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.406454] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] rescue_timeout = 0 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.406610] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] reserved_host_cpus = 0 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.406765] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] reserved_host_disk_mb = 0 {{(pid=62277) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.406918] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] reserved_host_memory_mb = 512 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.407086] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] reserved_huge_pages = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.407248] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] resize_confirm_window = 0 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.407404] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] resize_fs_using_block_device = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.407559] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] resume_guests_state_on_host_boot = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.407729] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] rootwrap_config = /etc/nova/rootwrap.conf {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.407889] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] rpc_response_timeout = 60 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.408057] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] run_external_periodic_tasks = True {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.408230] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] running_deleted_instance_action = reap {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.408389] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] running_deleted_instance_poll_interval = 1800 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.408545] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] running_deleted_instance_timeout = 0 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.408702] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] scheduler_instance_sync_interval = 120 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.408870] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] service_down_time = 720 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.409046] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] servicegroup_driver = db {{(pid=62277) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.409210] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] shelved_offload_time = 0 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.409369] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] shelved_poll_interval = 3600 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.409532] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] shutdown_timeout = 0 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.409694] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] source_is_ipv6 = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.409852] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] ssl_only = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.410111] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] state_path = /opt/stack/data/n-cpu-1 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.410280] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] sync_power_state_interval = 600 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.410441] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] sync_power_state_pool_size = 1000 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.410607] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] syslog_log_facility = LOG_USER {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.410767] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] tempdir = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.410925] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] timeout_nbd = 10 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.411103] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] transport_url = **** {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.411267] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] update_resources_interval = 0 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.411424] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] use_cow_images = True {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.411582] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 
None None] use_eventlog = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.411738] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] use_journal = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.411894] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] use_json = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.412061] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] use_rootwrap_daemon = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.412223] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] use_stderr = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.412381] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] use_syslog = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.412539] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vcpu_pin_set = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.412705] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vif_plugging_is_fatal = True {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.412872] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vif_plugging_timeout = 300 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.413045] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] virt_mkfs = [] {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.413210] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] volume_usage_poll_interval = 0 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.413367] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] watch_log_file = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.413534] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] web = /usr/share/spice-html5 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}} [ 613.413723] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_concurrency.disable_process_locking = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.414027] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_concurrency.lock_path = /opt/stack/data/n-cpu-1 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.414207] 
env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_messaging_metrics.metrics_buffer_size = 1000 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.414377] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_messaging_metrics.metrics_enabled = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.414547] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_messaging_metrics.metrics_process_name = {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.414714] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.414883] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.415076] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] api.auth_strategy = keystone {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.415247] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] api.compute_link_prefix = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.415422] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.415595] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] api.dhcp_domain = novalocal {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.415766] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] api.enable_instance_password = True {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.415933] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] api.glance_link_prefix = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.416110] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] api.instance_list_cells_batch_fixed_size = 100 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.416286] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] api.instance_list_cells_batch_strategy = distributed {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.416449] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] 
api.instance_list_per_project_cells = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.416611] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] api.list_records_by_skipping_down_cells = True {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.416775] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] api.local_metadata_per_cell = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.416940] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] api.max_limit = 1000 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.417117] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] api.metadata_cache_expiration = 15 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.417297] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] api.neutron_default_tenant_id = default {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.417464] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] api.use_forwarded_for = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.417633] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] api.use_neutron_default_nets = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.417801] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] api.vendordata_dynamic_connect_timeout = 5 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.417967] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] api.vendordata_dynamic_failure_fatal = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.418150] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] api.vendordata_dynamic_read_timeout = 5 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.418324] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] api.vendordata_dynamic_ssl_certfile = {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.418495] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] api.vendordata_dynamic_targets = [] {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.418658] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] api.vendordata_jsonfile_path = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.418836] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] 
api.vendordata_providers = ['StaticJSON'] {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.419034] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cache.backend = dogpile.cache.memcached {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.419210] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cache.backend_argument = **** {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.419382] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cache.config_prefix = cache.oslo {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.419551] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cache.dead_timeout = 60.0 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.419716] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cache.debug_cache_backend = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.419881] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cache.enable_retry_client = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.420055] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cache.enable_socket_keepalive = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.420229] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cache.enabled = True {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.420396] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cache.expiration_time = 600 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.420561] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cache.hashclient_retry_attempts = 2 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.420727] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cache.hashclient_retry_delay = 1.0 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.420893] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cache.memcache_dead_retry = 300 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.421073] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cache.memcache_password = {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.421242] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cache.memcache_pool_connection_get_timeout = 10 {{(pid=62277) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.421406] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cache.memcache_pool_flush_on_reconnect = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.421570] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cache.memcache_pool_maxsize = 10 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.421734] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cache.memcache_pool_unused_timeout = 60 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.421896] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cache.memcache_sasl_enabled = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.422089] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cache.memcache_servers = ['localhost:11211'] {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.422261] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cache.memcache_socket_timeout = 1.0 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.422431] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cache.memcache_username = {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.422599] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cache.proxies = [] {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.422763] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cache.retry_attempts = 2 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.422934] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cache.retry_delay = 0.0 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.423108] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cache.socket_keepalive_count = 1 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.423272] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cache.socket_keepalive_idle = 1 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.423432] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cache.socket_keepalive_interval = 1 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.423589] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cache.tls_allowed_ciphers = None {{(pid=62277) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.423747] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cache.tls_cafile = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.423905] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cache.tls_certfile = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.424078] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cache.tls_enabled = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.424240] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cache.tls_keyfile = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.424409] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cinder.auth_section = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.424585] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cinder.auth_type = password {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.424747] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cinder.cafile = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.424923] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cinder.catalog_info = volumev3::publicURL {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.425097] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cinder.certfile = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.425262] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cinder.collect_timing = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.425424] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cinder.cross_az_attach = True {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.425585] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cinder.debug = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.425743] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cinder.endpoint_template = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.425908] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cinder.http_retries = 3 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.426081] env[62277]: DEBUG oslo_service.service [None 
req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cinder.insecure = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.426243] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cinder.keyfile = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.426411] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cinder.os_region_name = RegionOne {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.426604] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cinder.split_loggers = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.426736] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cinder.timeout = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.426905] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] compute.consecutive_build_service_disable_threshold = 10 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.427075] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] compute.cpu_dedicated_set = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.427237] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] compute.cpu_shared_set = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.427403] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] compute.image_type_exclude_list = [] {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.427563] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] compute.live_migration_wait_for_vif_plug = True {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.427728] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] compute.max_concurrent_disk_ops = 0 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.427887] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] compute.max_disk_devices_to_attach = -1 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.428058] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] compute.packing_host_numa_cells_allocation_strategy = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.428233] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] compute.provider_config_location = /etc/nova/provider_config/ {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.428395] env[62277]: DEBUG oslo_service.service 
[None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] compute.resource_provider_association_refresh = 300 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.428557] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] compute.shutdown_retry_interval = 10 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.428736] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.428914] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] conductor.workers = 2 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.429099] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] console.allowed_origins = [] {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.429264] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] console.ssl_ciphers = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.429435] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] console.ssl_minimum_version = default {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.429607] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] consoleauth.token_ttl = 600 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.429778] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cyborg.cafile = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.429937] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cyborg.certfile = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.430116] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cyborg.collect_timing = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.430279] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cyborg.connect_retries = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.430440] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cyborg.connect_retry_delay = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.430600] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cyborg.endpoint_override = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.430763] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] 
cyborg.insecure = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.430923] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cyborg.keyfile = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.431093] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cyborg.max_version = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.431259] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cyborg.min_version = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.431421] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cyborg.region_name = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.431578] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cyborg.service_name = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.431747] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cyborg.service_type = accelerator {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.431910] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cyborg.split_loggers = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.432080] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cyborg.status_code_retries = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.432243] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cyborg.status_code_retry_delay = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.432400] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cyborg.timeout = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.432578] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cyborg.valid_interfaces = ['internal', 'public'] {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.432737] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] cyborg.version = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.432921] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] database.backend = sqlalchemy {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.433108] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] database.connection = **** {{(pid=62277) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.433283] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] database.connection_debug = 0 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.433451] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] database.connection_parameters = {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.433615] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] database.connection_recycle_time = 3600 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.433782] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] database.connection_trace = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.433943] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] database.db_inc_retry_interval = True {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.434118] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] database.db_max_retries = 20 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.434282] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] database.db_max_retry_interval = 10 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.434442] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] database.db_retry_interval = 1 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.434645] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] database.max_overflow = 50 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.434816] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] database.max_pool_size = 5 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.434985] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] database.max_retries = 10 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.435173] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] database.mysql_sql_mode = TRADITIONAL {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.436454] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] database.mysql_wsrep_sync_wait = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.436454] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] database.pool_timeout = None {{(pid=62277) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.436454] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] database.retry_interval = 10 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.436454] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] database.slave_connection = **** {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.436454] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] database.sqlite_synchronous = True {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.436454] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] database.use_db_reconnect = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.436702] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] api_database.backend = sqlalchemy {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.436702] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] api_database.connection = **** {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.436702] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] api_database.connection_debug = 0 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.436883] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] api_database.connection_parameters = {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.437064] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] api_database.connection_recycle_time = 3600 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.437235] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] api_database.connection_trace = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.437398] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] api_database.db_inc_retry_interval = True {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.437560] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] api_database.db_max_retries = 20 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.437729] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] api_database.db_max_retry_interval = 10 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.437891] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] api_database.db_retry_interval = 1 {{(pid=62277) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.438073] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] api_database.max_overflow = 50 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.438238] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] api_database.max_pool_size = 5 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.438404] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] api_database.max_retries = 10 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.438572] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] api_database.mysql_sql_mode = TRADITIONAL {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.438733] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] api_database.mysql_wsrep_sync_wait = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.438894] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] api_database.pool_timeout = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.439073] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] api_database.retry_interval = 10 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.439237] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] api_database.slave_connection = **** {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.439402] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] api_database.sqlite_synchronous = True {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.439577] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] devices.enabled_mdev_types = [] {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.439752] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] ephemeral_storage_encryption.cipher = aes-xts-plain64 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.439919] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] ephemeral_storage_encryption.enabled = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.440096] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] ephemeral_storage_encryption.key_size = 512 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.440269] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] glance.api_servers = None {{(pid=62277) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.440431] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] glance.cafile = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.440592] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] glance.certfile = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.440757] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] glance.collect_timing = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.440920] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] glance.connect_retries = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.441092] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] glance.connect_retry_delay = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.441258] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] glance.debug = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.441423] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] glance.default_trusted_certificate_ids = [] {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.441587] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] glance.enable_certificate_validation = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.441749] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] glance.enable_rbd_download = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.441910] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] glance.endpoint_override = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.442086] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] glance.insecure = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.442253] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] glance.keyfile = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.442413] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] glance.max_version = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.442570] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] glance.min_version = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.442736] env[62277]: DEBUG 
oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] glance.num_retries = 3 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.442906] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] glance.rbd_ceph_conf = {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.443079] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] glance.rbd_connect_timeout = 5 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.443251] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] glance.rbd_pool = {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.443419] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] glance.rbd_user = {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.443578] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] glance.region_name = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.443737] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] glance.service_name = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.443906] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] glance.service_type = image {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.444077] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] glance.split_loggers = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.444239] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] glance.status_code_retries = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.444396] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] glance.status_code_retry_delay = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.444553] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] glance.timeout = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.444735] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] glance.valid_interfaces = ['internal', 'public'] {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.444913] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] glance.verify_glance_signatures = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.445089] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] glance.version = None {{(pid=62277) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.445260] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] guestfs.debug = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.445431] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] hyperv.config_drive_cdrom = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.445593] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] hyperv.config_drive_inject_password = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.445763] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] hyperv.dynamic_memory_ratio = 1.0 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.445927] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] hyperv.enable_instance_metrics_collection = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.446100] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] hyperv.enable_remotefx = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.446272] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] hyperv.instances_path_share = {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.446443] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] hyperv.iscsi_initiator_list = [] {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.446606] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] hyperv.limit_cpu_features = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.446771] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] hyperv.mounted_disk_query_retry_count = 10 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.446932] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] hyperv.mounted_disk_query_retry_interval = 5 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.447106] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] hyperv.power_state_check_timeframe = 60 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.447276] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] hyperv.power_state_event_polling_interval = 2 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.447446] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] hyperv.qemu_img_cmd = qemu-img.exe {{(pid=62277) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.447617] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] hyperv.use_multipath_io = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.447779] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] hyperv.volume_attach_retry_count = 10 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.447936] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] hyperv.volume_attach_retry_interval = 5 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.448105] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] hyperv.vswitch_name = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.448269] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] hyperv.wait_soft_reboot_seconds = 60 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.448434] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] mks.enabled = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.448795] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] mks.mksproxy_base_url = http://127.0.0.1:6090/ {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.448988] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] image_cache.manager_interval = 2400 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.449175] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] image_cache.precache_concurrency = 1 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.449346] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] image_cache.remove_unused_base_images = True {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.449517] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] image_cache.remove_unused_original_minimum_age_seconds = 86400 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.449690] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] image_cache.remove_unused_resized_minimum_age_seconds = 3600 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.449870] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] image_cache.subdirectory_name = _base {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.450059] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] ironic.api_max_retries 
= 60 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.450225] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] ironic.api_retry_interval = 2 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.450386] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] ironic.auth_section = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.450546] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] ironic.auth_type = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.450706] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] ironic.cafile = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.450866] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] ironic.certfile = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.451037] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] ironic.collect_timing = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.451205] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] ironic.conductor_group = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.451365] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] ironic.connect_retries = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.451522] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] ironic.connect_retry_delay = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.451681] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] ironic.endpoint_override = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.451849] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] ironic.insecure = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.452017] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] ironic.keyfile = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.452182] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] ironic.max_version = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.452340] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] ironic.min_version = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.452505] env[62277]: DEBUG 
oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] ironic.peer_list = [] {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.452666] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] ironic.region_name = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.452833] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] ironic.serial_console_state_timeout = 10 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.452992] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] ironic.service_name = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.453177] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] ironic.service_type = baremetal {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.453343] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] ironic.split_loggers = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.453501] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] ironic.status_code_retries = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.453661] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] ironic.status_code_retry_delay = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.453819] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] ironic.timeout = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.453999] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] ironic.valid_interfaces = ['internal', 'public'] {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.454177] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] ironic.version = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.454360] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] key_manager.backend = nova.keymgr.conf_key_mgr.ConfKeyManager {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.454533] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] key_manager.fixed_key = **** {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.454712] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] barbican.auth_endpoint = http://localhost/identity/v3 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.454877] env[62277]: DEBUG oslo_service.service [None 
req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] barbican.barbican_api_version = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.455048] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] barbican.barbican_endpoint = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.455230] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] barbican.barbican_endpoint_type = public {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.455389] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] barbican.barbican_region_name = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.455547] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] barbican.cafile = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.455704] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] barbican.certfile = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.455867] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] barbican.collect_timing = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.456038] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] barbican.insecure = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.456200] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] barbican.keyfile = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.456364] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] barbican.number_of_retries = 60 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.456524] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] barbican.retry_delay = 1 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.456685] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] barbican.send_service_user_token = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.456849] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] barbican.split_loggers = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.457013] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] barbican.timeout = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.457185] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] barbican.verify_ssl = True {{(pid=62277) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.457345] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] barbican.verify_ssl_path = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.457525] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] barbican_service_user.auth_section = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.457695] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] barbican_service_user.auth_type = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.457854] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] barbican_service_user.cafile = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.458025] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] barbican_service_user.certfile = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.458189] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] barbican_service_user.collect_timing = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.458351] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] barbican_service_user.insecure = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.458508] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] barbican_service_user.keyfile = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.458673] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] barbican_service_user.split_loggers = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.458833] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] barbican_service_user.timeout = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.459031] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vault.approle_role_id = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.459199] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vault.approle_secret_id = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.459359] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vault.cafile = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.459517] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vault.certfile = None {{(pid=62277) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.459682] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vault.collect_timing = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.459846] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vault.insecure = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.460017] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vault.keyfile = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.460213] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vault.kv_mountpoint = secret {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.460381] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vault.kv_path = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.460564] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vault.kv_version = 2 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.460727] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vault.namespace = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.460889] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vault.root_token_id = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.461063] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vault.split_loggers = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.461234] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vault.ssl_ca_crt_file = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.461384] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vault.timeout = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.461545] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vault.use_ssl = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.461715] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vault.vault_url = http://127.0.0.1:8200 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.461888] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] keystone.auth_section = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.462062] env[62277]: DEBUG oslo_service.service [None 
req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] keystone.auth_type = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.462229] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] keystone.cafile = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.462389] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] keystone.certfile = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.462552] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] keystone.collect_timing = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.462710] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] keystone.connect_retries = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.462873] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] keystone.connect_retry_delay = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.463041] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] keystone.endpoint_override = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.463208] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] keystone.insecure = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.463369] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] keystone.keyfile = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.463530] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] keystone.max_version = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.463690] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] keystone.min_version = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.463852] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] keystone.region_name = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.464015] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] keystone.service_name = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.464193] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] keystone.service_type = identity {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.464355] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] keystone.split_loggers = False {{(pid=62277) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.464514] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] keystone.status_code_retries = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.464674] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] keystone.status_code_retry_delay = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.464833] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] keystone.timeout = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.465023] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] keystone.valid_interfaces = ['internal', 'public'] {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.465188] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] keystone.version = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.465387] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.connection_uri = {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.465547] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.cpu_mode = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.465714] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.cpu_model_extra_flags = [] {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.465884] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.cpu_models = [] {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.466065] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.cpu_power_governor_high = performance {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.466242] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.cpu_power_governor_low = powersave {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.466408] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.cpu_power_management = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.466580] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.cpu_power_management_strategy = cpu_state {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.466745] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.device_detach_attempts = 8 {{(pid=62277) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.466908] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.device_detach_timeout = 20 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.467093] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.disk_cachemodes = [] {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.467259] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.disk_prefix = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.467427] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.enabled_perf_events = [] {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.467592] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.file_backed_memory = 0 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.467762] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.gid_maps = [] {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.467922] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.hw_disk_discard = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.468091] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.hw_machine_type = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.468262] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.images_rbd_ceph_conf = {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.468424] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.images_rbd_glance_copy_poll_interval = 15 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.468592] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.images_rbd_glance_copy_timeout = 600 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.468757] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.images_rbd_glance_store_name = {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.468924] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.images_rbd_pool = rbd {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.469107] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.images_type = default {{(pid=62277) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.469270] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.images_volume_group = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.469432] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.inject_key = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.469593] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.inject_partition = -2 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.469752] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.inject_password = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.469914] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.iscsi_iface = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.470085] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.iser_use_multipath = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.470253] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.live_migration_bandwidth = 0 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.470415] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.live_migration_completion_timeout = 800 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.470577] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.live_migration_downtime = 500 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.470739] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.live_migration_downtime_delay = 75 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.470907] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.live_migration_downtime_steps = 10 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.471083] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.live_migration_inbound_addr = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.471252] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.live_migration_permit_auto_converge = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.471415] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.live_migration_permit_post_copy = False {{(pid=62277) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.471576] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.live_migration_scheme = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.471749] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.live_migration_timeout_action = abort {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.471915] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.live_migration_tunnelled = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.472085] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.live_migration_uri = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.472257] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.live_migration_with_native_tls = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.472419] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.max_queues = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.472582] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.mem_stats_period_seconds = 10 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.472741] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.nfs_mount_options = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.473057] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.nfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.473234] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.num_aoe_discover_tries = 3 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.473401] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.num_iser_scan_tries = 5 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.473562] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.num_memory_encrypted_guests = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.473726] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.num_nvme_discover_tries = 5 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.473891] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.num_pcie_ports = 0 
{{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.474069] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.num_volume_scan_tries = 5 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.474239] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.pmem_namespaces = [] {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.474400] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.quobyte_client_cfg = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.474681] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.quobyte_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.474863] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.rbd_connect_timeout = 5 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.475027] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.rbd_destroy_volume_retries = 12 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.475199] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.rbd_destroy_volume_retry_interval = 5 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.475358] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.rbd_secret_uuid = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.475516] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.rbd_user = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.475678] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.realtime_scheduler_priority = 1 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.475853] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.remote_filesystem_transport = ssh {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.476021] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.rescue_image_id = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.476184] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.rescue_kernel_id = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.476342] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.rescue_ramdisk_id = None {{(pid=62277) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.476512] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.rng_dev_path = /dev/urandom {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.476671] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.rx_queue_size = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.476840] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.smbfs_mount_options = {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.477128] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.smbfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.477304] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.snapshot_compression = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.477465] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.snapshot_image_format = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.477698] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.snapshots_directory = /opt/stack/data/nova/instances/snapshots {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.477881] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.sparse_logical_volumes = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.478059] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.swtpm_enabled = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.478236] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.swtpm_group = tss {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.478407] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.swtpm_user = tss {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.478580] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.sysinfo_serial = unique {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.478741] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.tb_cache_size = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.478902] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.tx_queue_size = None {{(pid=62277) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.479080] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.uid_maps = [] {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.479249] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.use_virtio_for_bridges = True {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.479423] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.virt_type = kvm {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.479593] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.volume_clear = zero {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.479757] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.volume_clear_size = 0 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.479926] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.volume_use_multipath = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.480095] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.vzstorage_cache_path = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.480265] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.480430] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.vzstorage_mount_group = qemu {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.480594] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.vzstorage_mount_opts = [] {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.480760] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.vzstorage_mount_perms = 0770 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.481042] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.vzstorage_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.481223] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.vzstorage_mount_user = stack {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.481389] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] libvirt.wait_soft_reboot_seconds = 120 {{(pid=62277) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.481557] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] neutron.auth_section = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.481730] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] neutron.auth_type = password {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.481890] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] neutron.cafile = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.482059] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] neutron.certfile = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.482226] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] neutron.collect_timing = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.482383] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] neutron.connect_retries = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.482541] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] neutron.connect_retry_delay = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.482709] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] neutron.default_floating_pool = public {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.482871] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] neutron.endpoint_override = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.483046] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] neutron.extension_sync_interval = 600 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.483213] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] neutron.http_retries = 3 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.483377] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] neutron.insecure = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.483536] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] neutron.keyfile = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.483692] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] neutron.max_version = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.483863] env[62277]: 
DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] neutron.metadata_proxy_shared_secret = **** {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.484030] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] neutron.min_version = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.484201] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] neutron.ovs_bridge = br-int {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.484366] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] neutron.physnets = [] {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.484534] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] neutron.region_name = RegionOne {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.484700] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] neutron.service_metadata_proxy = True {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.484859] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] neutron.service_name = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.485032] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] neutron.service_type = network {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.485199] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] neutron.split_loggers = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.485357] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] neutron.status_code_retries = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.485516] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] neutron.status_code_retry_delay = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.485673] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] neutron.timeout = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.485853] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] neutron.valid_interfaces = ['internal', 'public'] {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.486026] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] neutron.version = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.486197] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None 
None] notifications.bdms_in_notifications = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.486376] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] notifications.default_level = INFO {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.486547] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] notifications.notification_format = unversioned {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.486714] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] notifications.notify_on_state_change = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.486895] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] notifications.versioned_notifications_topics = ['versioned_notifications'] {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.487084] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] pci.alias = [] {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.487259] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] pci.device_spec = [] {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.487424] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] pci.report_in_placement = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.487595] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] placement.auth_section = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.487770] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] placement.auth_type = password {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.487939] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] placement.auth_url = http://10.180.1.21/identity {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.488111] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] placement.cafile = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.488275] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] placement.certfile = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.488437] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] placement.collect_timing = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.488599] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] 
placement.connect_retries = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.488755] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] placement.connect_retry_delay = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.488915] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] placement.default_domain_id = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.489084] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] placement.default_domain_name = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.489244] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] placement.domain_id = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.489400] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] placement.domain_name = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.489557] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] placement.endpoint_override = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.489716] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] placement.insecure = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.489872] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] placement.keyfile = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.490035] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] placement.max_version = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.490194] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] placement.min_version = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.490358] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] placement.password = **** {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.490514] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] placement.project_domain_id = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.490673] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] placement.project_domain_name = Default {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.490836] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] placement.project_id = None {{(pid=62277) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.491011] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] placement.project_name = service {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.491187] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] placement.region_name = RegionOne {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.491344] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] placement.service_name = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.491508] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] placement.service_type = placement {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.491666] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] placement.split_loggers = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.491824] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] placement.status_code_retries = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.491983] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] placement.status_code_retry_delay = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.492155] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] placement.system_scope = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.492311] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] placement.timeout = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.492471] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] placement.trust_id = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.492626] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] placement.user_domain_id = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.492793] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] placement.user_domain_name = Default {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.492949] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] placement.user_id = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.493135] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] placement.username = placement {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 
613.493315] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] placement.valid_interfaces = ['internal', 'public'] {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.493472] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] placement.version = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.493645] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] quota.cores = 20 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.493808] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] quota.count_usage_from_placement = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.493976] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] quota.driver = nova.quota.DbQuotaDriver {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.494159] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] quota.injected_file_content_bytes = 10240 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.494325] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] quota.injected_file_path_length = 255 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.494491] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] quota.injected_files = 5 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.494653] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] quota.instances = 10 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.494820] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] quota.key_pairs = 100 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.494984] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] quota.metadata_items = 128 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.495161] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] quota.ram = 51200 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.495325] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] quota.recheck_quota = True {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.495501] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] quota.server_group_members = 10 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.495666] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None 
None] quota.server_groups = 10 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.495836] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] rdp.enabled = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.496158] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.496346] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] scheduler.discover_hosts_in_cells_interval = -1 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.496514] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] scheduler.enable_isolated_aggregate_filtering = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.496679] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] scheduler.image_metadata_prefilter = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.496843] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] scheduler.limit_tenants_to_placement_aggregate = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.497012] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] scheduler.max_attempts = 3 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.497183] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] scheduler.max_placement_results = 1000 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.497348] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] scheduler.placement_aggregate_required_for_tenants = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.497509] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] scheduler.query_placement_for_image_type_support = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.497674] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] scheduler.query_placement_for_routed_network_aggregates = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.497846] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] scheduler.workers = 2 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.498024] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] filter_scheduler.aggregate_image_properties_isolation_namespace = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 
613.498198] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] filter_scheduler.aggregate_image_properties_isolation_separator = . {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.498378] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.498550] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] filter_scheduler.build_failure_weight_multiplier = 1000000.0 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.498716] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] filter_scheduler.cpu_weight_multiplier = 1.0 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.498882] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.499057] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] filter_scheduler.disk_weight_multiplier = 1.0 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.499251] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter', 'SameHostFilter', 'DifferentHostFilter'] {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.499422] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] filter_scheduler.host_subset_size = 1 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.499590] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] filter_scheduler.hypervisor_version_weight_multiplier = 1.0 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.499750] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] filter_scheduler.image_properties_default_architecture = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.499917] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] filter_scheduler.io_ops_weight_multiplier = -1.0 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.500093] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] filter_scheduler.isolated_hosts = [] {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.500260] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] filter_scheduler.isolated_images = [] 
{{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.500422] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] filter_scheduler.max_instances_per_host = 50 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.500581] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] filter_scheduler.max_io_ops_per_host = 8 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.500742] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] filter_scheduler.num_instances_weight_multiplier = 0.0 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.500905] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] filter_scheduler.pci_in_placement = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.501076] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] filter_scheduler.pci_weight_multiplier = 1.0 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.501242] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] filter_scheduler.ram_weight_multiplier = 1.0 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.501408] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.501569] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] filter_scheduler.shuffle_best_same_weighed_hosts = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.501731] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] filter_scheduler.soft_affinity_weight_multiplier = 1.0 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.501893] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.502067] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] filter_scheduler.track_instance_changes = True {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.502248] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.502418] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] metrics.required = True {{(pid=62277) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.502583] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] metrics.weight_multiplier = 1.0 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.502749] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] metrics.weight_of_unavailable = -10000.0 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.502915] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] metrics.weight_setting = [] {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.503266] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] serial_console.base_url = ws://127.0.0.1:6083/ {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.503452] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] serial_console.enabled = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.503631] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] serial_console.port_range = 10000:20000 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.503807] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] serial_console.proxyclient_address = 127.0.0.1 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.504015] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] serial_console.serialproxy_host = 0.0.0.0 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.504196] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] serial_console.serialproxy_port = 6083 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.504367] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] service_user.auth_section = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.504540] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] service_user.auth_type = password {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.504704] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] service_user.cafile = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.504866] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] service_user.certfile = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.505040] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] service_user.collect_timing = False {{(pid=62277) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.505207] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] service_user.insecure = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.505366] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] service_user.keyfile = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.505550] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] service_user.send_service_user_token = True {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.505715] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] service_user.split_loggers = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.505877] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] service_user.timeout = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.506059] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] spice.agent_enabled = True {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.506227] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] spice.enabled = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.506525] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.506717] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] spice.html5proxy_host = 0.0.0.0 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.506887] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] spice.html5proxy_port = 6082 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.507059] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] spice.image_compression = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.507225] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] spice.jpeg_compression = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.507384] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] spice.playback_compression = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.507554] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] spice.server_listen = 127.0.0.1 {{(pid=62277) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.507725] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] spice.server_proxyclient_address = 127.0.0.1 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.507885] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] spice.streaming_mode = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.508053] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] spice.zlib_compression = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.508223] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] upgrade_levels.baseapi = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.508383] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] upgrade_levels.cert = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.508552] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] upgrade_levels.compute = auto {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.508713] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] upgrade_levels.conductor = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.508872] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] upgrade_levels.scheduler = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.509050] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vendordata_dynamic_auth.auth_section = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.509218] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vendordata_dynamic_auth.auth_type = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.509379] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vendordata_dynamic_auth.cafile = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.509538] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vendordata_dynamic_auth.certfile = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.509702] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vendordata_dynamic_auth.collect_timing = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.509865] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vendordata_dynamic_auth.insecure = False {{(pid=62277) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.510034] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vendordata_dynamic_auth.keyfile = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.510201] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vendordata_dynamic_auth.split_loggers = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.510359] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vendordata_dynamic_auth.timeout = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.510529] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vmware.api_retry_count = 10 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.510690] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vmware.ca_file = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.510863] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vmware.cache_prefix = devstack-image-cache {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.511039] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vmware.cluster_name = testcl1 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.511212] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vmware.connection_pool_size = 10 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.511372] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vmware.console_delay_seconds = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.511539] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vmware.datastore_regex = ^datastore.* {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.511747] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vmware.host_ip = vc1.osci.c.eu-de-1.cloud.sap {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.511921] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vmware.host_password = **** {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.512099] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vmware.host_port = 443 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.512270] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vmware.host_username = administrator@vsphere.local {{(pid=62277) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.512437] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vmware.insecure = True {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.512600] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vmware.integration_bridge = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.512762] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vmware.maximum_objects = 100 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.512922] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vmware.pbm_default_policy = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.513095] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vmware.pbm_enabled = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.513256] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vmware.pbm_wsdl_location = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.513422] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vmware.serial_log_dir = /opt/vmware/vspc {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.513582] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vmware.serial_port_proxy_uri = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.513740] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vmware.serial_port_service_uri = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.513911] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vmware.task_poll_interval = 0.5 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.514092] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vmware.use_linked_clone = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.514266] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vmware.vnc_keymap = en-us {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.514431] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vmware.vnc_port = 5900 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.514593] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vmware.vnc_port_total = 10000 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.514779] 
env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vnc.auth_schemes = ['none'] {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.514956] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vnc.enabled = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.515264] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vnc.novncproxy_base_url = http://127.0.0.1:6080/vnc_auto.html {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.515452] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vnc.novncproxy_host = 0.0.0.0 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.515623] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vnc.novncproxy_port = 6080 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.515800] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vnc.server_listen = 127.0.0.1 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.515972] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vnc.server_proxyclient_address = 127.0.0.1 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.516146] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vnc.vencrypt_ca_certs = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.516305] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vnc.vencrypt_client_cert = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.516462] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vnc.vencrypt_client_key = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.516632] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] workarounds.disable_compute_service_check_for_ffu = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.516796] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] workarounds.disable_deep_image_inspection = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.516956] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] workarounds.disable_fallback_pcpu_query = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.517130] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] workarounds.disable_group_policy_check_upcall = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 
613.517291] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] workarounds.disable_libvirt_livesnapshot = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.517453] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] workarounds.disable_rootwrap = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.517615] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] workarounds.enable_numa_live_migration = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.517772] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] workarounds.enable_qemu_monitor_announce_self = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.517935] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.518108] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] workarounds.handle_virt_lifecycle_events = True {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.518270] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] workarounds.libvirt_disable_apic = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.518431] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] workarounds.never_download_image_if_on_rbd = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.518594] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] workarounds.qemu_monitor_announce_self_count = 3 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.518753] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] workarounds.qemu_monitor_announce_self_interval = 1 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.518912] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] workarounds.reserve_disk_resource_for_image_cache = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.519084] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] workarounds.skip_cpu_compare_at_startup = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.519247] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] workarounds.skip_cpu_compare_on_dest = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.519405] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None 
None] workarounds.skip_hypervisor_version_check_on_lm = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.519565] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] workarounds.skip_reserve_in_use_ironic_nodes = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.519723] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] workarounds.unified_limits_count_pcpu_as_vcpu = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.519887] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.520080] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] wsgi.api_paste_config = /etc/nova/api-paste.ini {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.520251] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] wsgi.client_socket_timeout = 900 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.520418] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] wsgi.default_pool_size = 1000 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.520585] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] wsgi.keep_alive = True {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.520750] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] wsgi.max_header_line = 16384 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.520913] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] wsgi.secure_proxy_ssl_header = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.521085] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] wsgi.ssl_ca_file = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.521250] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] wsgi.ssl_cert_file = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.521411] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] wsgi.ssl_key_file = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.521576] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] wsgi.tcp_keepidle = 600 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.521750] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] 
wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.521918] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] zvm.ca_file = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.522092] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] zvm.cloud_connector_url = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.522374] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] zvm.image_tmp_path = /opt/stack/data/n-cpu-1/images {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.522547] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] zvm.reachable_timeout = 300 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.522724] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_policy.enforce_new_defaults = True {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.522895] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_policy.enforce_scope = True {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.523084] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_policy.policy_default_rule = default {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.523271] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_policy.policy_dirs = ['policy.d'] {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.523448] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_policy.policy_file = policy.yaml {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.523620] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_policy.remote_content_type = application/x-www-form-urlencoded {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.523783] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_policy.remote_ssl_ca_crt_file = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.523945] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_policy.remote_ssl_client_crt_file = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.524115] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_policy.remote_ssl_client_key_file = None {{(pid=62277) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.524280] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_policy.remote_ssl_verify_server_crt = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.524449] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_versionedobjects.fatal_exception_format_errors = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.524623] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.524802] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] profiler.connection_string = messaging:// {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.524967] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] profiler.enabled = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.525150] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] profiler.es_doc_type = notification {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.525316] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] profiler.es_scroll_size = 10000 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.525484] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] profiler.es_scroll_time = 2m {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.525647] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] profiler.filter_error_trace = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.525816] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] profiler.hmac_keys = **** {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.525981] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] profiler.sentinel_service_name = mymaster {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.526162] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] profiler.socket_timeout = 0.1 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.526325] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] profiler.trace_requests = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.526485] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] profiler.trace_sqlalchemy = False {{(pid=62277) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.526660] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] profiler_jaeger.process_tags = {} {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.526822] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] profiler_jaeger.service_name_prefix = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.526982] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] profiler_otlp.service_name_prefix = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.527161] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] remote_debug.host = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.527320] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] remote_debug.port = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.527497] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_messaging_rabbit.amqp_auto_delete = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.527665] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_messaging_rabbit.amqp_durable_queues = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.527831] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_messaging_rabbit.conn_pool_min_size = 2 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.527993] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_messaging_rabbit.conn_pool_ttl = 1200 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.528171] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_messaging_rabbit.direct_mandatory_flag = True {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.528334] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_messaging_rabbit.enable_cancel_on_failover = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.528494] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_messaging_rabbit.heartbeat_in_pthread = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.528655] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_messaging_rabbit.heartbeat_rate = 2 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.528816] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] 
oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.528974] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_messaging_rabbit.kombu_compression = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.529156] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_messaging_rabbit.kombu_failover_strategy = round-robin {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.529323] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.529492] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.529660] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_messaging_rabbit.rabbit_ha_queues = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.529822] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_messaging_rabbit.rabbit_interval_max = 30 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.529994] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.530169] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.530332] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.530496] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.530658] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.530821] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_messaging_rabbit.rabbit_quorum_queue = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.530987] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_messaging_rabbit.rabbit_retry_backoff = 2 
{{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.531162] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_messaging_rabbit.rabbit_retry_interval = 1 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.531323] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.531487] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_messaging_rabbit.rpc_conn_pool_size = 30 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.531653] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_messaging_rabbit.ssl = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.531827] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_messaging_rabbit.ssl_ca_file = {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.531998] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_messaging_rabbit.ssl_cert_file = {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.532174] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_messaging_rabbit.ssl_enforce_fips_mode = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.532346] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_messaging_rabbit.ssl_key_file = {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.532515] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_messaging_rabbit.ssl_version = {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.532700] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_messaging_notifications.driver = ['messagingv2'] {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.532868] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_messaging_notifications.retry = -1 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.533061] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_messaging_notifications.topics = ['notifications'] {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.533243] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_messaging_notifications.transport_url = **** {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.533557] env[62277]: DEBUG oslo_service.service 
[None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_limit.auth_section = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.533757] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_limit.auth_type = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.533926] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_limit.cafile = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.534111] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_limit.certfile = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.534282] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_limit.collect_timing = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.534443] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_limit.connect_retries = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.534605] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_limit.connect_retry_delay = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.534767] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_limit.endpoint_id = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.534927] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_limit.endpoint_override = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.535104] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_limit.insecure = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.535266] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_limit.keyfile = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.535425] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_limit.max_version = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.535581] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_limit.min_version = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.535737] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_limit.region_name = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.535894] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_limit.service_name = None {{(pid=62277) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.536059] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_limit.service_type = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.536224] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_limit.split_loggers = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.536382] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_limit.status_code_retries = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.536541] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_limit.status_code_retry_delay = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.536697] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_limit.timeout = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.536857] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_limit.valid_interfaces = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.537024] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_limit.version = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.537196] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_reports.file_event_handler = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.537362] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_reports.file_event_handler_interval = 1 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.537522] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] oslo_reports.log_dir = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.537696] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vif_plug_linux_bridge_privileged.capabilities = [12] {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.537859] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vif_plug_linux_bridge_privileged.group = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.538029] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vif_plug_linux_bridge_privileged.helper_command = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.538204] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vif_plug_linux_bridge_privileged.logger_name = 
oslo_privsep.daemon {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.538370] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vif_plug_linux_bridge_privileged.thread_pool_size = 8 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.538529] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vif_plug_linux_bridge_privileged.user = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.538702] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vif_plug_ovs_privileged.capabilities = [12, 1] {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.538862] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vif_plug_ovs_privileged.group = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.539030] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vif_plug_ovs_privileged.helper_command = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.539204] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.539365] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vif_plug_ovs_privileged.thread_pool_size = 8 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.539524] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] vif_plug_ovs_privileged.user = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.539693] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] os_vif_linux_bridge.flat_interface = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.539873] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] os_vif_linux_bridge.forward_bridge_interface = ['all'] {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.540057] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] os_vif_linux_bridge.iptables_bottom_regex = {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.540234] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] os_vif_linux_bridge.iptables_drop_action = DROP {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.540405] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] os_vif_linux_bridge.iptables_top_regex = {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.540572] 
env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] os_vif_linux_bridge.network_device_mtu = 1500 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.540740] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] os_vif_linux_bridge.use_ipv6 = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.540904] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] os_vif_linux_bridge.vlan_interface = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.541096] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] os_vif_ovs.default_qos_type = linux-noop {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.541271] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] os_vif_ovs.isolate_vif = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.541439] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] os_vif_ovs.network_device_mtu = 1500 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.541604] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] os_vif_ovs.ovs_vsctl_timeout = 120 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.541771] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.541940] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] os_vif_ovs.ovsdb_interface = native {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.542114] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] os_vif_ovs.per_port_bridge = False {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.542284] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] os_brick.lock_path = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.542453] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] privsep_osbrick.capabilities = [21] {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.542614] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] privsep_osbrick.group = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.542775] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] privsep_osbrick.helper_command = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.542940] env[62277]: 
DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] privsep_osbrick.logger_name = os_brick.privileged {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.543119] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] privsep_osbrick.thread_pool_size = 8 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.543283] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] privsep_osbrick.user = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.543458] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.543618] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] nova_sys_admin.group = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.543777] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] nova_sys_admin.helper_command = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.543941] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] nova_sys_admin.logger_name = oslo_privsep.daemon {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.544115] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] nova_sys_admin.thread_pool_size = 8 {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.544276] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] nova_sys_admin.user = None {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 613.544405] env[62277]: DEBUG oslo_service.service [None req-ca8201dd-7bca-4361-ac67-c5756c171c33 None None] ******************************************************************************** {{(pid=62277) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2613}} [ 613.544825] env[62277]: INFO nova.service [-] Starting compute node (version 0.1.0) [ 613.554289] env[62277]: WARNING nova.virt.vmwareapi.driver [None req-e84e8df3-2b0a-4939-8ca6-1ba76bf2d6bd None None] The vmwareapi driver is not tested by the OpenStack project nor does it have clear maintainer(s) and thus its quality can not be ensured. It should be considered experimental and may be removed in a future release. If you are using the driver in production please let us know via the openstack-discuss mailing list. 
[ 613.554725] env[62277]: INFO nova.virt.node [None req-e84e8df3-2b0a-4939-8ca6-1ba76bf2d6bd None None] Generated node identity 75e125ea-a599-4b65-b9cd-6ea881735292 [ 613.554945] env[62277]: INFO nova.virt.node [None req-e84e8df3-2b0a-4939-8ca6-1ba76bf2d6bd None None] Wrote node identity 75e125ea-a599-4b65-b9cd-6ea881735292 to /opt/stack/data/n-cpu-1/compute_id [ 613.568460] env[62277]: WARNING nova.compute.manager [None req-e84e8df3-2b0a-4939-8ca6-1ba76bf2d6bd None None] Compute nodes ['75e125ea-a599-4b65-b9cd-6ea881735292'] for host cpu-1 were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning. [ 613.602281] env[62277]: INFO nova.compute.manager [None req-e84e8df3-2b0a-4939-8ca6-1ba76bf2d6bd None None] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host [ 613.623467] env[62277]: WARNING nova.compute.manager [None req-e84e8df3-2b0a-4939-8ca6-1ba76bf2d6bd None None] No compute node record found for host cpu-1. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host cpu-1 could not be found. [ 613.623694] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e84e8df3-2b0a-4939-8ca6-1ba76bf2d6bd None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 613.623883] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e84e8df3-2b0a-4939-8ca6-1ba76bf2d6bd None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 613.624042] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e84e8df3-2b0a-4939-8ca6-1ba76bf2d6bd None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 613.624219] env[62277]: DEBUG nova.compute.resource_tracker [None req-e84e8df3-2b0a-4939-8ca6-1ba76bf2d6bd None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62277) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 613.625358] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb0eaef2-9097-4982-973e-1ad16fe31fd9 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 613.633740] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d1d15448-5fdc-4c04-b014-dc4a436ea7b0 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 613.647206] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3936435d-a026-4822-a89b-08987bc4141e {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 613.653070] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-0bf07021-6903-4e69-8b31-3ae253481dc3 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 613.682676] env[62277]: DEBUG nova.compute.resource_tracker [None req-e84e8df3-2b0a-4939-8ca6-1ba76bf2d6bd None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181446MB free_disk=184GB free_vcpus=48 pci_devices=None {{(pid=62277) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 613.682773] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e84e8df3-2b0a-4939-8ca6-1ba76bf2d6bd None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 613.682949] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e84e8df3-2b0a-4939-8ca6-1ba76bf2d6bd None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 613.693986] env[62277]: WARNING nova.compute.resource_tracker [None req-e84e8df3-2b0a-4939-8ca6-1ba76bf2d6bd None None] No compute node record for cpu-1:75e125ea-a599-4b65-b9cd-6ea881735292: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 75e125ea-a599-4b65-b9cd-6ea881735292 could not be found. [ 613.706201] env[62277]: INFO nova.compute.resource_tracker [None req-e84e8df3-2b0a-4939-8ca6-1ba76bf2d6bd None None] Compute node record created for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 with uuid: 75e125ea-a599-4b65-b9cd-6ea881735292 [ 613.754387] env[62277]: DEBUG nova.compute.resource_tracker [None req-e84e8df3-2b0a-4939-8ca6-1ba76bf2d6bd None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 613.754610] env[62277]: DEBUG nova.compute.resource_tracker [None req-e84e8df3-2b0a-4939-8ca6-1ba76bf2d6bd None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 613.859961] env[62277]: INFO nova.scheduler.client.report [None req-e84e8df3-2b0a-4939-8ca6-1ba76bf2d6bd None None] [req-2c0d7688-d030-4588-9861-39f9fc64fa01] Created resource provider record via placement API for resource provider with UUID 75e125ea-a599-4b65-b9cd-6ea881735292 and name domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28. 
[ 613.875858] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b838216c-8a19-4338-9f62-c5342b331184 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 613.883526] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-279089d5-14f7-445d-8f5f-51317dc5cba0 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 613.914298] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c9e1105-36a7-460a-bac6-2db1e4b86930 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 613.921165] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-39333187-8244-43f2-91a0-0ef00e06d64a {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 613.933809] env[62277]: DEBUG nova.compute.provider_tree [None req-e84e8df3-2b0a-4939-8ca6-1ba76bf2d6bd None None] Updating inventory in ProviderTree for provider 75e125ea-a599-4b65-b9cd-6ea881735292 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 613.972179] env[62277]: DEBUG nova.scheduler.client.report [None req-e84e8df3-2b0a-4939-8ca6-1ba76bf2d6bd None None] Updated inventory for provider 75e125ea-a599-4b65-b9cd-6ea881735292 with generation 0 in Placement from set_inventory_for_provider using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:957}} [ 613.972409] env[62277]: DEBUG nova.compute.provider_tree [None req-e84e8df3-2b0a-4939-8ca6-1ba76bf2d6bd None None] Updating resource provider 75e125ea-a599-4b65-b9cd-6ea881735292 generation from 0 to 1 during operation: update_inventory {{(pid=62277) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 613.972553] env[62277]: DEBUG nova.compute.provider_tree [None req-e84e8df3-2b0a-4939-8ca6-1ba76bf2d6bd None None] Updating inventory in ProviderTree for provider 75e125ea-a599-4b65-b9cd-6ea881735292 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 614.024483] env[62277]: DEBUG nova.compute.provider_tree [None req-e84e8df3-2b0a-4939-8ca6-1ba76bf2d6bd None None] Updating 
resource provider 75e125ea-a599-4b65-b9cd-6ea881735292 generation from 1 to 2 during operation: update_traits {{(pid=62277) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 614.044080] env[62277]: DEBUG nova.compute.resource_tracker [None req-e84e8df3-2b0a-4939-8ca6-1ba76bf2d6bd None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62277) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 614.044280] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e84e8df3-2b0a-4939-8ca6-1ba76bf2d6bd None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.361s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 614.044433] env[62277]: DEBUG nova.service [None req-e84e8df3-2b0a-4939-8ca6-1ba76bf2d6bd None None] Creating RPC server for service compute {{(pid=62277) start /opt/stack/nova/nova/service.py:182}} [ 614.065508] env[62277]: DEBUG nova.service [None req-e84e8df3-2b0a-4939-8ca6-1ba76bf2d6bd None None] Join ServiceGroup membership for this service compute {{(pid=62277) start /opt/stack/nova/nova/service.py:199}} [ 614.065742] env[62277]: DEBUG nova.servicegroup.drivers.db [None req-e84e8df3-2b0a-4939-8ca6-1ba76bf2d6bd None None] DB_Driver: join new ServiceGroup member cpu-1 to the compute group, service = {{(pid=62277) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}} [ 623.385496] env[62277]: DEBUG dbcounter [-] [62277] Writing DB stats nova_cell1:SELECT=1 {{(pid=62277) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:115}} [ 623.387385] env[62277]: DEBUG dbcounter [-] [62277] Writing DB stats nova_cell0:SELECT=1 {{(pid=62277) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:115}} [ 648.067782] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._sync_power_states {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 648.079144] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Getting list of instances from cluster (obj){ [ 648.079144] env[62277]: value = "domain-c8" [ 648.079144] env[62277]: _type = "ClusterComputeResource" [ 648.079144] env[62277]: } {{(pid=62277) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 648.080282] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ea40726d-b177-4054-9d62-84221876125e {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 648.089570] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Got total of 0 instances {{(pid=62277) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 648.089796] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 648.090160] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Getting list of instances from cluster (obj){ [ 648.090160] 
env[62277]: value = "domain-c8" [ 648.090160] env[62277]: _type = "ClusterComputeResource" [ 648.090160] env[62277]: } {{(pid=62277) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 648.090966] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-08047273-576b-4d83-9c3b-0b8c8789f4f3 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 648.098745] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Got total of 0 instances {{(pid=62277) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 670.178016] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 670.178479] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 670.178623] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Starting heal instance info cache {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9909}} [ 670.178752] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Rebuilding the list of instances to heal {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9913}} [ 670.189549] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Didn't find any instances for network info cache update. 
{{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9995}} [ 670.189732] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 670.189950] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 670.190169] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 670.190604] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 670.190816] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 670.191020] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 670.191196] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=62277) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10528}} [ 670.191349] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager.update_available_resource {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 670.201358] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 670.201567] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 670.201730] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 670.201884] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62277) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 670.202930] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01e00c86-5deb-44c3-9e74-8515c5ad9049 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 670.211577] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-17723019-1261-45c1-b46b-cecf63bb9804 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 670.225967] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1115ac88-0f1d-435b-8acb-cc0a4aae6eba {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 670.232137] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5cbf1a7c-0e47-4c50-8f95-c7178762605f {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 670.260255] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181440MB free_disk=184GB free_vcpus=48 pci_devices=None {{(pid=62277) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 670.260391] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 670.260562] 
env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 670.288907] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 670.289076] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 670.302676] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-20e7bf1c-0ed1-4f0d-96f2-c3cafc64026a {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 670.309821] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-187342eb-7e30-4124-905b-e2fffd79e9e5 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 670.339555] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fa473b5c-fcd9-474e-a31d-73860120c756 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 670.346696] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e782022d-4a9e-4d48-a4ef-4f11bd3c6921 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 670.359457] env[62277]: DEBUG nova.compute.provider_tree [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 670.367589] env[62277]: DEBUG nova.scheduler.client.report [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 670.368750] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62277) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 670.368915] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.108s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 730.354720] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 730.364549] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 730.364724] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Starting heal instance info cache {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9909}} [ 730.364838] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Rebuilding the list of instances to heal {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9913}} [ 730.372295] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Didn't find any instances for network info cache update. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9995}} [ 730.372506] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 730.372664] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 731.169310] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 732.164547] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 732.168300] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 732.168623] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 732.168726] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62277) run_periodic_tasks 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 732.168818] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=62277) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10528}} [ 732.168951] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager.update_available_resource {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 732.178815] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 732.179076] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 732.179249] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 732.179411] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62277) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 732.180603] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-624cfa7a-cf49-4959-a13c-60b13251e6c1 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 732.189008] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7bf49ea5-9a0b-43c5-b166-4ac01d0fcfd8 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 732.202380] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09de10d3-19cd-4e0e-9637-ba1eccfe0187 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 732.208291] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-774f09fc-292e-4a9b-bccd-073c78d5d5e4 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 732.237218] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181436MB free_disk=184GB free_vcpus=48 pci_devices=None {{(pid=62277) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 732.237381] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc 
None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 732.237567] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 732.268461] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 732.268461] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 732.279665] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bdefb079-4218-472d-8512-ccac755249f3 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 732.286363] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b758153f-dcf1-40d4-8d22-7d61bc3e9449 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 732.315846] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bbbb0f71-4a9c-4698-a043-2abedfdf0bb5 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 732.322986] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-75b7319c-79e4-4137-90a1-4ada84ea36aa {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 732.335676] env[62277]: DEBUG nova.compute.provider_tree [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 732.346284] env[62277]: DEBUG nova.scheduler.client.report [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 732.347395] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62277) _update_available_resource 
/opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 732.347563] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.110s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 790.347770] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 790.348192] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Starting heal instance info cache {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9909}} [ 790.348192] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Rebuilding the list of instances to heal {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9913}} [ 790.357412] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Didn't find any instances for network info cache update. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9995}} [ 791.168063] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 792.164496] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 792.168564] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 792.168564] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 792.168564] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 792.168564] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 793.170112] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager.update_available_resource {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 793.179336] env[62277]: DEBUG 
oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 793.179575] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 793.179751] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 793.179904] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62277) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 793.181011] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-429e2e56-3aa6-4d2e-b9ad-81ec0acad641 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 793.189487] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-13ee494a-e312-488d-a895-301fc1da30de {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 793.202868] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b822c1e6-1ad9-4292-91b1-8e538a784279 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 793.208840] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7cc05054-af9b-40d0-9372-2585f705288f {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 793.236978] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181440MB free_disk=184GB free_vcpus=48 pci_devices=None {{(pid=62277) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 793.237118] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 793.237323] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 793.266598] env[62277]: DEBUG nova.compute.resource_tracker [None 
req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 793.266762] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 793.279735] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a4e84f37-6211-455e-a64d-fe18e68e4529 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 793.287445] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db2f4043-33f5-4bbd-94b8-cc76b164f3ac {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 793.315819] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-13272f94-8a4c-457a-8af2-4c5e48cdca7c {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 793.322318] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0011b49e-a3f9-4e83-a41d-a1d51706c23b {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 793.334714] env[62277]: DEBUG nova.compute.provider_tree [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 793.342704] env[62277]: DEBUG nova.scheduler.client.report [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 793.343790] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62277) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 793.343957] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.107s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 794.343449] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62277) run_periodic_tasks 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 794.343819] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=62277) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10528}} [ 851.169638] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 851.170069] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Starting heal instance info cache {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9909}} [ 851.170069] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Rebuilding the list of instances to heal {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9913}} [ 851.178891] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Didn't find any instances for network info cache update. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9995}} [ 851.179097] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 852.168805] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 852.169066] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 853.165229] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 853.167847] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager.update_available_resource {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 853.177865] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 853.178095] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 853.178263] 
env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 853.178415] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62277) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 853.179496] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a2b773f-fb6a-496b-86a3-ac3750a03881 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 853.189023] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-255a214b-10bf-427d-80bc-ef02f10b835b {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 853.202167] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9c62428f-35b1-43eb-9277-c936c18cc98b {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 853.207957] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-03fb26a5-f317-4bc4-99f9-c471e7f489ed {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 853.235915] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181450MB free_disk=184GB free_vcpus=48 pci_devices=None {{(pid=62277) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 853.236056] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 853.236229] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 853.281663] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 853.281827] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 853.294790] env[62277]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3ce6a76c-cb98-4564-ab47-5629910e4b59 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 853.302106] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-befb1355-abe8-4705-b316-5c7f8289f918 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 853.331613] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-259606fc-3641-439d-90a4-44adff9a9baf {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 853.338668] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79d19ee2-6870-4da1-a63f-da8ee6ec9230 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 853.351350] env[62277]: DEBUG nova.compute.provider_tree [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 853.359561] env[62277]: DEBUG nova.scheduler.client.report [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 853.360757] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62277) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 853.360932] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.125s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 854.362630] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 854.363022] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 855.164659] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 855.175374] 
env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 855.175550] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=62277) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10528}} [ 910.169569] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 910.169954] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Cleaning up deleted instances {{(pid=62277) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11196}} [ 910.182205] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] There are 0 instances to clean {{(pid=62277) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11205}} [ 910.182365] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 910.183071] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Cleaning up deleted instances with incomplete migration {{(pid=62277) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11234}} [ 910.193536] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 911.203911] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 911.204274] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Starting heal instance info cache {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9909}} [ 911.204274] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Rebuilding the list of instances to heal {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9913}} [ 911.212979] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Didn't find any instances for network info cache update. 
{{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9995}} [ 912.168554] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 913.169181] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 913.169626] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 914.168286] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 914.168538] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager.update_available_resource {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 914.178647] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 914.178995] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 914.178995] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 914.179174] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62277) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 914.180303] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d266d39-0fba-4b81-8e04-bbbdf1d19f7e {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 914.189509] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-19fd2631-11ac-4fc5-b2b1-ce9badfec0aa {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 914.202904] env[62277]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b81610b-c55c-44b0-99aa-ee987b96a312 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 914.208907] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ea65930-e0a2-442f-80e7-61d222c7e561 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 914.236823] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181447MB free_disk=184GB free_vcpus=48 pci_devices=None {{(pid=62277) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 914.236964] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 914.237156] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 914.289783] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 914.289953] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 914.307565] env[62277]: DEBUG nova.scheduler.client.report [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Refreshing inventories for resource provider 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 914.320330] env[62277]: DEBUG nova.scheduler.client.report [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Updating ProviderTree inventory for provider 75e125ea-a599-4b65-b9cd-6ea881735292 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 914.320507] env[62277]: DEBUG nova.compute.provider_tree [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Updating inventory in ProviderTree for provider 75e125ea-a599-4b65-b9cd-6ea881735292 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 
512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 914.332686] env[62277]: DEBUG nova.scheduler.client.report [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Refreshing aggregate associations for resource provider 75e125ea-a599-4b65-b9cd-6ea881735292, aggregates: None {{(pid=62277) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 914.348150] env[62277]: DEBUG nova.scheduler.client.report [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Refreshing trait associations for resource provider 75e125ea-a599-4b65-b9cd-6ea881735292, traits: COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_VMDK {{(pid=62277) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 914.360290] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01ba998c-a890-4806-be11-17df540dc0ac {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 914.368109] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4518306-21bd-4599-8a2f-32f81da4491d {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 914.397533] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ae17772c-0c5c-41ad-94c6-707cd0590b70 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 914.404402] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2dcbb38a-295f-4539-8e30-a0b4997388c3 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 914.417069] env[62277]: DEBUG nova.compute.provider_tree [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 914.425627] env[62277]: DEBUG nova.scheduler.client.report [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 914.426697] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62277) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 914.426870] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.190s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 915.422711] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 915.423166] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 917.168725] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 917.169067] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=62277) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10528}} [ 972.169732] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 972.170117] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Starting heal instance info cache {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9909}} [ 972.170117] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Rebuilding the list of instances to heal {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9913}} [ 972.180288] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Didn't find any instances for network info cache update. 
{{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9995}} [ 973.168648] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 973.168893] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 975.164227] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 975.168341] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 975.168667] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 975.168748] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager.update_available_resource {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 975.185252] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 975.185476] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 975.185655] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 975.185811] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62277) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 975.186963] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3be2ce2e-449d-4dee-accc-815afa5976fb {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 975.197756] env[62277]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-df3e5d19-6650-46c0-98e4-0e8d920811e8 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 975.217227] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b167486-6bce-4b0b-9302-e87116e1e495 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 975.226560] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd9d65d2-ced2-40c4-9af6-f3e28569a3b4 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 975.262030] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181454MB free_disk=184GB free_vcpus=48 pci_devices=None {{(pid=62277) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 975.262185] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 975.262377] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 975.317569] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 975.317741] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 975.337199] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-03f1d399-871a-407d-a6f8-005c8b6315b3 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 975.346738] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c081a976-9d81-49d2-8fdd-a889996de4d7 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 975.383161] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7bc9ca31-9c5e-4446-960d-c1b0d454b602 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 975.391599] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e126cfb-3b94-41f3-b5a6-21b2aefd7c37 {{(pid=62277) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 975.406151] env[62277]: DEBUG nova.compute.provider_tree [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 975.417284] env[62277]: DEBUG nova.scheduler.client.report [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 975.418335] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62277) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 975.418454] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.156s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 976.414687] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 977.168654] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 979.168310] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 979.168677] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=62277) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10528}} [ 979.732455] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Acquiring lock "1829c328-2a68-4297-b80e-afb0e898ba72" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 979.732779] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Lock "1829c328-2a68-4297-b80e-afb0e898ba72" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 979.762526] env[62277]: DEBUG nova.compute.manager [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 979.911568] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 979.911568] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 979.912510] env[62277]: INFO nova.compute.claims [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 980.081612] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ae6ee7a4-143f-44a5-8436-d2033dfd4ea6 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 980.092283] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd94fec9-c3ad-4a2c-8f36-89972d274307 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 980.135575] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f6a5c370-f988-4691-aed6-25b015c77eb9 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 980.144386] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db2048b6-1394-4d9c-9589-fcdf8ce9b6a5 {{(pid=62277) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 980.165954] env[62277]: DEBUG nova.compute.provider_tree [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 980.180653] env[62277]: DEBUG nova.scheduler.client.report [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 980.218192] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.305s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 980.218192] env[62277]: DEBUG nova.compute.manager [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] Start building networks asynchronously for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 980.296247] env[62277]: DEBUG nova.compute.utils [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Using /dev/sd instead of None {{(pid=62277) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 980.297861] env[62277]: DEBUG nova.compute.manager [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] Allocating IP information in the background. {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 980.298188] env[62277]: DEBUG nova.network.neutron [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] allocate_for_instance() {{(pid=62277) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 980.332455] env[62277]: DEBUG nova.compute.manager [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] Start building block device mappings for instance. 
{{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 980.428413] env[62277]: DEBUG nova.compute.manager [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] Start spawning the instance on the hypervisor. {{(pid=62277) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 980.653027] env[62277]: DEBUG oslo_concurrency.lockutils [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Acquiring lock "6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 980.653027] env[62277]: DEBUG oslo_concurrency.lockutils [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Lock "6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 980.678792] env[62277]: DEBUG nova.compute.manager [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 980.767116] env[62277]: DEBUG oslo_concurrency.lockutils [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 980.767116] env[62277]: DEBUG oslo_concurrency.lockutils [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 980.768215] env[62277]: INFO nova.compute.claims [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 980.893617] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a141e07a-178e-4654-ba62-00c17e20ce1a {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 980.906724] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05ceccad-d411-4f3e-93d7-45b3f5d02d9f {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 980.950794] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-6ba764ae-a64d-4c28-9ebe-404ffcdae7dc {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 980.960168] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-86a592e8-125d-4f5d-8988-76e232cf97f5 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 980.967289] env[62277]: DEBUG nova.virt.hardware [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-11-06T22:11:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-11-06T22:11:15Z,direct_url=,disk_format='vmdk',id=6f125163-af69-40e9-92ae-3b8a01d74b60,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='63ed6630e9c140baa826f53d7a0564d1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-11-06T22:11:16Z,virtual_size=,visibility=), allow threads: False {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 980.967539] env[62277]: DEBUG nova.virt.hardware [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Flavor limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 980.967690] env[62277]: DEBUG nova.virt.hardware [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Image limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 980.967863] env[62277]: DEBUG nova.virt.hardware [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Flavor pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 980.968011] env[62277]: DEBUG nova.virt.hardware [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Image pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 980.968164] env[62277]: DEBUG nova.virt.hardware [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 980.968367] env[62277]: DEBUG nova.virt.hardware [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 980.968516] env[62277]: DEBUG nova.virt.hardware [None 
req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 980.968717] env[62277]: DEBUG nova.virt.hardware [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Got 1 possible topologies {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 980.968870] env[62277]: DEBUG nova.virt.hardware [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 980.969048] env[62277]: DEBUG nova.virt.hardware [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 980.971093] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-845cda05-4d96-4675-b446-756f4b4436ea {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 980.996701] env[62277]: DEBUG nova.compute.provider_tree [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 981.006937] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b5a96e32-92a2-4022-ad2b-25f89b496bb1 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 981.012386] env[62277]: DEBUG nova.scheduler.client.report [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 981.030106] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-045699a9-9451-434f-9522-e6c72a3dc3a5 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 981.042071] env[62277]: DEBUG oslo_concurrency.lockutils [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.275s {{(pid=62277) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 981.042634] env[62277]: DEBUG nova.compute.manager [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] Start building networks asynchronously for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 981.082855] env[62277]: DEBUG nova.compute.utils [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Using /dev/sd instead of None {{(pid=62277) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 981.086023] env[62277]: DEBUG nova.compute.manager [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] Not allocating networking since 'none' was specified. {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 981.098200] env[62277]: DEBUG nova.compute.manager [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] Start building block device mappings for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 981.217867] env[62277]: DEBUG nova.compute.manager [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] Start spawning the instance on the hypervisor. 
{{(pid=62277) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 981.259562] env[62277]: DEBUG nova.virt.hardware [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-11-06T22:11:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-11-06T22:11:15Z,direct_url=,disk_format='vmdk',id=6f125163-af69-40e9-92ae-3b8a01d74b60,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='63ed6630e9c140baa826f53d7a0564d1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-11-06T22:11:16Z,virtual_size=,visibility=), allow threads: False {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 981.259992] env[62277]: DEBUG nova.virt.hardware [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Flavor limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 981.259992] env[62277]: DEBUG nova.virt.hardware [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Image limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 981.260112] env[62277]: DEBUG nova.virt.hardware [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Flavor pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 981.260212] env[62277]: DEBUG nova.virt.hardware [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Image pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 981.260382] env[62277]: DEBUG nova.virt.hardware [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 981.260547] env[62277]: DEBUG nova.virt.hardware [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 981.260723] env[62277]: DEBUG nova.virt.hardware [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 981.261175] env[62277]: DEBUG nova.virt.hardware [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 
tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Got 1 possible topologies {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 981.262751] env[62277]: DEBUG nova.virt.hardware [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 981.262853] env[62277]: DEBUG nova.virt.hardware [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 981.263755] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-395d38df-d95c-416d-b1ab-d7d390269ac6 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 981.276499] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c2bce8b7-d6fd-49a5-b03d-8685c3c1ee65 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 981.294876] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] Instance VIF info [] {{(pid=62277) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 981.305625] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Creating folder: OpenStack. Parent ref: group-v4. {{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 981.306040] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f9002938-2e51-467b-8012-0987597f92f1 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 981.319489] env[62277]: INFO nova.virt.vmwareapi.vm_util [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Created folder: OpenStack in parent group-v4. [ 981.319489] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Creating folder: Project (e448631fabaf4c939ae8ff571c09b03a). Parent ref: group-v297781. {{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 981.319725] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-faced015-f3dd-4948-b68f-98f956d8dfcc {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 981.335172] env[62277]: INFO nova.virt.vmwareapi.vm_util [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Created folder: Project (e448631fabaf4c939ae8ff571c09b03a) in parent group-v297781. 
[ 981.337318] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Creating folder: Instances. Parent ref: group-v297782. {{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 981.337318] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f0a55a21-68dc-4d1a-9581-e6ff2b8da965 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 981.347694] env[62277]: INFO nova.virt.vmwareapi.vm_util [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Created folder: Instances in parent group-v297782. [ 981.347951] env[62277]: DEBUG oslo.service.loopingcall [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 981.348167] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] Creating VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 981.348355] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-65eb3cf0-d4d8-48b3-b909-bb306b0acb97 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 981.371282] env[62277]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 981.371282] env[62277]: value = "task-1405285" [ 981.371282] env[62277]: _type = "Task" [ 981.371282] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 981.379706] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405285, 'name': CreateVM_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 981.557774] env[62277]: DEBUG nova.policy [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2ac6122975bb4260b6b297bc565dacd9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5d4a0a411f594473a08c71fab39d2bf5', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62277) authorize /opt/stack/nova/nova/policy.py:203}} [ 981.886371] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405285, 'name': CreateVM_Task, 'duration_secs': 0.364482} completed successfully. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 981.886371] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] Created VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 981.887433] env[62277]: DEBUG oslo_vmware.service [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b533053f-56ae-4500-bcac-a94e01e9e88a {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 981.893567] env[62277]: DEBUG oslo_concurrency.lockutils [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 981.893723] env[62277]: DEBUG oslo_concurrency.lockutils [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 981.894404] env[62277]: DEBUG oslo_concurrency.lockutils [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 981.894654] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-af3a1a85-87b0-4b6f-9867-67d5e8dd553c {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 981.901425] env[62277]: DEBUG oslo_vmware.api [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Waiting for the task: (returnval){ [ 981.901425] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52d5ceac-c7a3-b478-e2fa-b370a23d06ff" [ 981.901425] env[62277]: _type = "Task" [ 981.901425] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 981.913832] env[62277]: DEBUG oslo_vmware.api [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52d5ceac-c7a3-b478-e2fa-b370a23d06ff, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 982.414571] env[62277]: DEBUG oslo_concurrency.lockutils [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 982.414864] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] Processing image 6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 982.416019] env[62277]: DEBUG oslo_concurrency.lockutils [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 982.416019] env[62277]: DEBUG oslo_concurrency.lockutils [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 982.416019] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 982.416260] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-700ca6c7-a244-4f15-8513-e523ad7768fd {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 982.434372] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 982.434494] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=62277) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 982.435585] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9497cdf4-65b9-4357-881f-0c1505583158 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 982.447863] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-318634c6-0833-4534-ab7c-fa788312a3e7 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 982.454991] env[62277]: DEBUG oslo_vmware.api [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Waiting for the task: (returnval){ [ 982.454991] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52c5e632-6931-f02e-2d47-0746d4505c48" [ 982.454991] env[62277]: _type = "Task" [ 982.454991] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 982.466670] env[62277]: DEBUG oslo_vmware.api [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52c5e632-6931-f02e-2d47-0746d4505c48, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 982.971740] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] Preparing fetch location {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 982.973095] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Creating directory with path [datastore2] vmware_temp/80e86d35-69df-4ee4-b932-2bfa3caa4df1/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 982.973095] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-171d00c5-f86e-418f-9d5f-fc4943027a4c {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 983.003475] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Created directory with path [datastore2] vmware_temp/80e86d35-69df-4ee4-b932-2bfa3caa4df1/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 983.003664] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] Fetch image to [datastore2] vmware_temp/80e86d35-69df-4ee4-b932-2bfa3caa4df1/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 983.003822] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None 
req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to [datastore2] vmware_temp/80e86d35-69df-4ee4-b932-2bfa3caa4df1/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 983.007013] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-47dc1d71-5e54-4ede-a210-51cbecee917c {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 983.014129] env[62277]: DEBUG nova.network.neutron [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] Successfully created port: 1cf98980-9b84-419e-859c-f9eedf9570e0 {{(pid=62277) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 983.016250] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b56ccdd6-164b-4063-8188-e0e94d292858 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 983.029151] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3be6e899-185b-439f-820b-6a70ea347990 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 983.064150] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a53f8242-7615-4ff6-8910-f8a6bea86882 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 983.074277] env[62277]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-25e6025f-5cc7-4a46-bf8a-b009a2846e3d {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 983.165139] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 983.294894] env[62277]: DEBUG oslo_vmware.rw_handles [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/80e86d35-69df-4ee4-b932-2bfa3caa4df1/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62277) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 983.369061] env[62277]: DEBUG oslo_vmware.rw_handles [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Completed reading data from the image iterator. 
{{(pid=62277) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 983.369273] env[62277]: DEBUG oslo_vmware.rw_handles [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/80e86d35-69df-4ee4-b932-2bfa3caa4df1/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62277) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 983.388710] env[62277]: DEBUG oslo_concurrency.lockutils [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Acquiring lock "3d260cd8-ab21-4e1e-8891-6f216350a587" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 983.389328] env[62277]: DEBUG oslo_concurrency.lockutils [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Lock "3d260cd8-ab21-4e1e-8891-6f216350a587" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 983.408658] env[62277]: DEBUG nova.compute.manager [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] Starting instance... 
{{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 983.490227] env[62277]: DEBUG oslo_concurrency.lockutils [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 983.490227] env[62277]: DEBUG oslo_concurrency.lockutils [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.003s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 983.491939] env[62277]: INFO nova.compute.claims [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 983.636394] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-47aec26f-2570-4214-a83d-fc67aa967e3d {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 983.644723] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-039b191d-4dd5-40af-b066-1fcf1ebaeb50 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 983.678956] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ceed0b7-9eca-4338-9f16-07f2e8b981e3 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 983.693226] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-77d61276-a35a-437e-997e-8b8686440d59 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 983.720459] env[62277]: DEBUG nova.compute.provider_tree [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 983.738530] env[62277]: DEBUG nova.scheduler.client.report [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 983.820448] env[62277]: DEBUG 
oslo_concurrency.lockutils [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.330s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 983.824450] env[62277]: DEBUG nova.compute.manager [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] Start building networks asynchronously for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 983.869330] env[62277]: DEBUG nova.compute.utils [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Using /dev/sd instead of None {{(pid=62277) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 983.871514] env[62277]: DEBUG nova.compute.manager [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] Allocating IP information in the background. {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 983.871810] env[62277]: DEBUG nova.network.neutron [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] allocate_for_instance() {{(pid=62277) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 983.888389] env[62277]: DEBUG nova.compute.manager [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] Start building block device mappings for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 984.001888] env[62277]: DEBUG nova.compute.manager [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] Start spawning the instance on the hypervisor. 
{{(pid=62277) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 984.036048] env[62277]: DEBUG nova.virt.hardware [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-11-06T22:11:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-11-06T22:11:15Z,direct_url=,disk_format='vmdk',id=6f125163-af69-40e9-92ae-3b8a01d74b60,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='63ed6630e9c140baa826f53d7a0564d1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-11-06T22:11:16Z,virtual_size=,visibility=), allow threads: False {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 984.036306] env[62277]: DEBUG nova.virt.hardware [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Flavor limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 984.036458] env[62277]: DEBUG nova.virt.hardware [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Image limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 984.036630] env[62277]: DEBUG nova.virt.hardware [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Flavor pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 984.036772] env[62277]: DEBUG nova.virt.hardware [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Image pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 984.036911] env[62277]: DEBUG nova.virt.hardware [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 984.038176] env[62277]: DEBUG nova.virt.hardware [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 984.038381] env[62277]: DEBUG nova.virt.hardware [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 
984.038857] env[62277]: DEBUG nova.virt.hardware [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Got 1 possible topologies {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 984.038857] env[62277]: DEBUG nova.virt.hardware [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 984.038857] env[62277]: DEBUG nova.virt.hardware [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 984.040049] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2ee5386c-d649-426e-a9b1-00ff13549573 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 984.051142] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d834672d-fb49-4256-8ebc-081215bfc601 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 984.193146] env[62277]: DEBUG nova.policy [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1f232a197cbc4094aa3b16f3ac856149', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0637edf123a14c9481b07ca6826d6456', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62277) authorize /opt/stack/nova/nova/policy.py:203}} [ 986.297862] env[62277]: DEBUG nova.network.neutron [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] Successfully updated port: 1cf98980-9b84-419e-859c-f9eedf9570e0 {{(pid=62277) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 986.340803] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Acquiring lock "refresh_cache-1829c328-2a68-4297-b80e-afb0e898ba72" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 986.340972] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Acquired lock "refresh_cache-1829c328-2a68-4297-b80e-afb0e898ba72" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 986.341138] env[62277]: DEBUG nova.network.neutron [None 
req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] Building network info cache for instance {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 986.459290] env[62277]: DEBUG nova.network.neutron [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] Successfully created port: 0bd5121b-3a4f-4aeb-9c5e-57ef44c5f729 {{(pid=62277) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 986.522053] env[62277]: DEBUG nova.network.neutron [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] Instance cache missing network info. {{(pid=62277) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 987.123930] env[62277]: DEBUG oslo_concurrency.lockutils [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Acquiring lock "d68ccb50-a04d-4e59-8161-f01305eb81a8" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 987.124657] env[62277]: DEBUG oslo_concurrency.lockutils [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Lock "d68ccb50-a04d-4e59-8161-f01305eb81a8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 987.142848] env[62277]: DEBUG nova.compute.manager [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] Starting instance... 
{{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 987.225341] env[62277]: DEBUG oslo_concurrency.lockutils [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 987.225539] env[62277]: DEBUG oslo_concurrency.lockutils [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 987.228531] env[62277]: INFO nova.compute.claims [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 987.982353] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bcab26c9-cb7c-41a4-9fc3-03e9a7764f0d {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 987.998992] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a3d077ca-baad-4ff4-8f5e-3b6536cb3fef {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 988.027645] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4116bf3b-2cfb-4566-994f-f1bc383e1476 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 988.037043] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-60cbf148-2944-4035-87b0-92edf288ada8 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 988.050840] env[62277]: DEBUG nova.compute.provider_tree [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 988.062387] env[62277]: DEBUG nova.scheduler.client.report [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 988.083713] env[62277]: DEBUG oslo_concurrency.lockutils [None 
req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.858s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 988.084492] env[62277]: DEBUG nova.compute.manager [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] Start building networks asynchronously for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 988.127768] env[62277]: DEBUG nova.compute.utils [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Using /dev/sd instead of None {{(pid=62277) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 988.128859] env[62277]: DEBUG nova.compute.manager [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] Allocating IP information in the background. {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 988.131856] env[62277]: DEBUG nova.network.neutron [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] allocate_for_instance() {{(pid=62277) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 988.144525] env[62277]: DEBUG nova.compute.manager [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] Start building block device mappings for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 988.275019] env[62277]: DEBUG nova.compute.manager [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] Start spawning the instance on the hypervisor. 
{{(pid=62277) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 988.304308] env[62277]: DEBUG nova.virt.hardware [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-11-06T22:11:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-11-06T22:11:15Z,direct_url=,disk_format='vmdk',id=6f125163-af69-40e9-92ae-3b8a01d74b60,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='63ed6630e9c140baa826f53d7a0564d1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-11-06T22:11:16Z,virtual_size=,visibility=), allow threads: False {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 988.304651] env[62277]: DEBUG nova.virt.hardware [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Flavor limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 988.304732] env[62277]: DEBUG nova.virt.hardware [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Image limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 988.305017] env[62277]: DEBUG nova.virt.hardware [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Flavor pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 988.305738] env[62277]: DEBUG nova.virt.hardware [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Image pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 988.305738] env[62277]: DEBUG nova.virt.hardware [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 988.305738] env[62277]: DEBUG nova.virt.hardware [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 988.305738] env[62277]: DEBUG nova.virt.hardware [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 988.306028] env[62277]: DEBUG nova.virt.hardware [None 
req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Got 1 possible topologies {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 988.306028] env[62277]: DEBUG nova.virt.hardware [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 988.306617] env[62277]: DEBUG nova.virt.hardware [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 988.307035] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-46291915-81a3-4ffe-a574-b798be7707c1 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 988.315408] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d5d5a0c3-36c6-4e56-9242-33071e34a14b {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 988.358217] env[62277]: DEBUG nova.compute.manager [req-de70e553-6b2f-43ab-8ee3-948397f0d023 req-20289fff-8419-40a9-8b0c-e1ed5c266122 service nova] [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] Received event network-vif-plugged-1cf98980-9b84-419e-859c-f9eedf9570e0 {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 988.358453] env[62277]: DEBUG oslo_concurrency.lockutils [req-de70e553-6b2f-43ab-8ee3-948397f0d023 req-20289fff-8419-40a9-8b0c-e1ed5c266122 service nova] Acquiring lock "1829c328-2a68-4297-b80e-afb0e898ba72-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 988.358665] env[62277]: DEBUG oslo_concurrency.lockutils [req-de70e553-6b2f-43ab-8ee3-948397f0d023 req-20289fff-8419-40a9-8b0c-e1ed5c266122 service nova] Lock "1829c328-2a68-4297-b80e-afb0e898ba72-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 988.358838] env[62277]: DEBUG oslo_concurrency.lockutils [req-de70e553-6b2f-43ab-8ee3-948397f0d023 req-20289fff-8419-40a9-8b0c-e1ed5c266122 service nova] Lock "1829c328-2a68-4297-b80e-afb0e898ba72-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 988.358997] env[62277]: DEBUG nova.compute.manager [req-de70e553-6b2f-43ab-8ee3-948397f0d023 req-20289fff-8419-40a9-8b0c-e1ed5c266122 service nova] [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] No waiting events found dispatching network-vif-plugged-1cf98980-9b84-419e-859c-f9eedf9570e0 {{(pid=62277) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 988.359238] env[62277]: WARNING nova.compute.manager [req-de70e553-6b2f-43ab-8ee3-948397f0d023 
req-20289fff-8419-40a9-8b0c-e1ed5c266122 service nova] [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] Received unexpected event network-vif-plugged-1cf98980-9b84-419e-859c-f9eedf9570e0 for instance with vm_state building and task_state spawning. [ 988.405334] env[62277]: DEBUG nova.policy [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '41c5ce64fe694612bbf97f3bbfba2c8a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1f956b4ec7ac40598a99a6cef7308e72', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62277) authorize /opt/stack/nova/nova/policy.py:203}} [ 988.513600] env[62277]: DEBUG nova.network.neutron [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] Updating instance_info_cache with network_info: [{"id": "1cf98980-9b84-419e-859c-f9eedf9570e0", "address": "fa:16:3e:94:15:2b", "network": {"id": "2c64e00f-4995-4827-b685-5096b1d1d064", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.49", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "63ed6630e9c140baa826f53d7a0564d1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "77aa121f-8fb6-42f3-aaea-43addfe449b2", "external-id": "nsx-vlan-transportzone-288", "segmentation_id": 288, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1cf98980-9b", "ovs_interfaceid": "1cf98980-9b84-419e-859c-f9eedf9570e0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 988.551387] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Releasing lock "refresh_cache-1829c328-2a68-4297-b80e-afb0e898ba72" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 988.551498] env[62277]: DEBUG nova.compute.manager [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] Instance network_info: |[{"id": "1cf98980-9b84-419e-859c-f9eedf9570e0", "address": "fa:16:3e:94:15:2b", "network": {"id": "2c64e00f-4995-4827-b685-5096b1d1d064", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.49", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "63ed6630e9c140baa826f53d7a0564d1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "77aa121f-8fb6-42f3-aaea-43addfe449b2", "external-id": "nsx-vlan-transportzone-288", "segmentation_id": 288, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1cf98980-9b", "ovs_interfaceid": "1cf98980-9b84-419e-859c-f9eedf9570e0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 988.551948] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:94:15:2b', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '77aa121f-8fb6-42f3-aaea-43addfe449b2', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '1cf98980-9b84-419e-859c-f9eedf9570e0', 'vif_model': 'vmxnet3'}] {{(pid=62277) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 988.561945] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Creating folder: Project (5d4a0a411f594473a08c71fab39d2bf5). Parent ref: group-v297781. {{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 988.564181] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d45bd231-57b5-4e15-96a2-7ccd53f00ae4 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 988.576909] env[62277]: INFO nova.virt.vmwareapi.vm_util [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Created folder: Project (5d4a0a411f594473a08c71fab39d2bf5) in parent group-v297781. [ 988.576909] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Creating folder: Instances. Parent ref: group-v297785. {{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 988.577299] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-bcfc96c1-14c4-4811-8dae-3b7cffc1da6e {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 988.589486] env[62277]: INFO nova.virt.vmwareapi.vm_util [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Created folder: Instances in parent group-v297785. [ 988.589857] env[62277]: DEBUG oslo.service.loopingcall [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 988.590201] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] Creating VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 988.592208] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-a53bb744-5b2a-47ed-96f7-d19c457e8645 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 988.621277] env[62277]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 988.621277] env[62277]: value = "task-1405288" [ 988.621277] env[62277]: _type = "Task" [ 988.621277] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 988.628987] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405288, 'name': CreateVM_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 989.133510] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405288, 'name': CreateVM_Task, 'duration_secs': 0.468833} completed successfully. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 989.133510] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] Created VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 989.183110] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 989.183110] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 989.183110] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 989.183110] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0bc365f0-6b73-46ce-a9d5-3bf8bf4af61f {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 989.183764] env[62277]: DEBUG oslo_vmware.api [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Waiting for the task: (returnval){ [ 989.183764] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]529de60c-ab5e-9e1d-1467-6a477a756686" [ 989.183764] env[62277]: _type = "Task" [ 989.183764] env[62277]: } to complete. 
{{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 989.197762] env[62277]: DEBUG oslo_vmware.api [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]529de60c-ab5e-9e1d-1467-6a477a756686, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 989.696639] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 989.697035] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] Processing image 6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 989.699366] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 990.766290] env[62277]: DEBUG nova.network.neutron [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] Successfully created port: 45d69044-2875-4a53-9b40-c309ba6ee1bb {{(pid=62277) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 990.795191] env[62277]: DEBUG nova.network.neutron [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] Successfully updated port: 0bd5121b-3a4f-4aeb-9c5e-57ef44c5f729 {{(pid=62277) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 990.815578] env[62277]: DEBUG oslo_concurrency.lockutils [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Acquiring lock "refresh_cache-3d260cd8-ab21-4e1e-8891-6f216350a587" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 990.815910] env[62277]: DEBUG oslo_concurrency.lockutils [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Acquired lock "refresh_cache-3d260cd8-ab21-4e1e-8891-6f216350a587" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 990.815910] env[62277]: DEBUG nova.network.neutron [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 
3d260cd8-ab21-4e1e-8891-6f216350a587] Building network info cache for instance {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 991.052301] env[62277]: DEBUG nova.network.neutron [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] Instance cache missing network info. {{(pid=62277) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 991.382024] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Acquiring lock "19d15611-315f-4c4b-8f32-e5d00d0d8ca8" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 991.382852] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Lock "19d15611-315f-4c4b-8f32-e5d00d0d8ca8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 991.396253] env[62277]: DEBUG nova.compute.manager [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 991.484194] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 991.484765] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 991.487199] env[62277]: INFO nova.compute.claims [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 991.648689] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aed075d6-f862-4d05-963d-a81bb75e0aa3 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 991.656574] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c0c9e55c-8f48-4150-a2ff-71be064e596e {{(pid=62277) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 991.690016] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7897842a-c6ac-40fc-92da-29f268789054 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 991.697902] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-40597d92-f103-42f7-ade0-9394212ac4cf {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 991.711368] env[62277]: DEBUG nova.compute.provider_tree [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 991.722781] env[62277]: DEBUG nova.scheduler.client.report [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 991.748392] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.264s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 991.750111] env[62277]: DEBUG nova.compute.manager [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] Start building networks asynchronously for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 991.798805] env[62277]: DEBUG nova.compute.utils [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Using /dev/sd instead of None {{(pid=62277) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 991.800534] env[62277]: DEBUG nova.compute.manager [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] Allocating IP information in the background. 
{{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 991.800671] env[62277]: DEBUG nova.network.neutron [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] allocate_for_instance() {{(pid=62277) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 991.815882] env[62277]: DEBUG nova.compute.manager [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] Start building block device mappings for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 991.911740] env[62277]: DEBUG nova.compute.manager [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] Start spawning the instance on the hypervisor. {{(pid=62277) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 991.968605] env[62277]: DEBUG nova.virt.hardware [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-11-06T22:11:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-11-06T22:11:15Z,direct_url=,disk_format='vmdk',id=6f125163-af69-40e9-92ae-3b8a01d74b60,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='63ed6630e9c140baa826f53d7a0564d1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-11-06T22:11:16Z,virtual_size=,visibility=), allow threads: False {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 991.969037] env[62277]: DEBUG nova.virt.hardware [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Flavor limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 991.969302] env[62277]: DEBUG nova.virt.hardware [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Image limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 991.969596] env[62277]: DEBUG nova.virt.hardware [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Flavor pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 991.969848] env[62277]: DEBUG nova.virt.hardware [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Image pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 991.970103] 
env[62277]: DEBUG nova.virt.hardware [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 991.970422] env[62277]: DEBUG nova.virt.hardware [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 991.970677] env[62277]: DEBUG nova.virt.hardware [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 991.972811] env[62277]: DEBUG nova.virt.hardware [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Got 1 possible topologies {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 991.972811] env[62277]: DEBUG nova.virt.hardware [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 991.972811] env[62277]: DEBUG nova.virt.hardware [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 991.972811] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-77016b02-f46c-49e3-b1dc-2a03e6b2faa7 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 991.984421] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78be8e88-665e-4b51-b0c5-df160e94a363 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 992.037640] env[62277]: DEBUG nova.policy [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '36a166823b854df697f4b886464c5114', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '29dbbf00030a407986371193ea850423', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62277) authorize /opt/stack/nova/nova/policy.py:203}} [ 992.219538] env[62277]: DEBUG nova.network.neutron [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c 
tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] Updating instance_info_cache with network_info: [{"id": "0bd5121b-3a4f-4aeb-9c5e-57ef44c5f729", "address": "fa:16:3e:91:de:e9", "network": {"id": "2c64e00f-4995-4827-b685-5096b1d1d064", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.139", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "63ed6630e9c140baa826f53d7a0564d1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "77aa121f-8fb6-42f3-aaea-43addfe449b2", "external-id": "nsx-vlan-transportzone-288", "segmentation_id": 288, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0bd5121b-3a", "ovs_interfaceid": "0bd5121b-3a4f-4aeb-9c5e-57ef44c5f729", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 992.236018] env[62277]: DEBUG oslo_concurrency.lockutils [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Releasing lock "refresh_cache-3d260cd8-ab21-4e1e-8891-6f216350a587" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 992.236382] env[62277]: DEBUG nova.compute.manager [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] Instance network_info: |[{"id": "0bd5121b-3a4f-4aeb-9c5e-57ef44c5f729", "address": "fa:16:3e:91:de:e9", "network": {"id": "2c64e00f-4995-4827-b685-5096b1d1d064", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.139", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "63ed6630e9c140baa826f53d7a0564d1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "77aa121f-8fb6-42f3-aaea-43addfe449b2", "external-id": "nsx-vlan-transportzone-288", "segmentation_id": 288, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0bd5121b-3a", "ovs_interfaceid": "0bd5121b-3a4f-4aeb-9c5e-57ef44c5f729", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 992.240239] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 
3d260cd8-ab21-4e1e-8891-6f216350a587] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:91:de:e9', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '77aa121f-8fb6-42f3-aaea-43addfe449b2', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '0bd5121b-3a4f-4aeb-9c5e-57ef44c5f729', 'vif_model': 'vmxnet3'}] {{(pid=62277) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 992.252039] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Creating folder: Project (0637edf123a14c9481b07ca6826d6456). Parent ref: group-v297781. {{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 992.253036] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-4a428c32-5167-4c27-bd0c-c2fc5bbebc4a {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 992.267990] env[62277]: INFO nova.virt.vmwareapi.vm_util [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Created folder: Project (0637edf123a14c9481b07ca6826d6456) in parent group-v297781. [ 992.268309] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Creating folder: Instances. Parent ref: group-v297788. {{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 992.268975] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-612731d7-25fc-4728-a71f-a3820b6d9ad1 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 992.282270] env[62277]: INFO nova.virt.vmwareapi.vm_util [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Created folder: Instances in parent group-v297788. [ 992.282270] env[62277]: DEBUG oslo.service.loopingcall [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 992.282487] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] Creating VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 992.282627] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-f04f6e4a-24a9-4a5f-b57c-7740608ee18d {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 992.307178] env[62277]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 992.307178] env[62277]: value = "task-1405291" [ 992.307178] env[62277]: _type = "Task" [ 992.307178] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 992.322996] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405291, 'name': CreateVM_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 992.436255] env[62277]: DEBUG nova.compute.manager [req-59e8a317-8394-4fda-b708-feaae7ef4a72 req-6a44c4bd-3ec2-4879-b141-2528b47f9c4d service nova] [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] Received event network-vif-plugged-0bd5121b-3a4f-4aeb-9c5e-57ef44c5f729 {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 992.436359] env[62277]: DEBUG oslo_concurrency.lockutils [req-59e8a317-8394-4fda-b708-feaae7ef4a72 req-6a44c4bd-3ec2-4879-b141-2528b47f9c4d service nova] Acquiring lock "3d260cd8-ab21-4e1e-8891-6f216350a587-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 992.436477] env[62277]: DEBUG oslo_concurrency.lockutils [req-59e8a317-8394-4fda-b708-feaae7ef4a72 req-6a44c4bd-3ec2-4879-b141-2528b47f9c4d service nova] Lock "3d260cd8-ab21-4e1e-8891-6f216350a587-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 992.436642] env[62277]: DEBUG oslo_concurrency.lockutils [req-59e8a317-8394-4fda-b708-feaae7ef4a72 req-6a44c4bd-3ec2-4879-b141-2528b47f9c4d service nova] Lock "3d260cd8-ab21-4e1e-8891-6f216350a587-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 992.436803] env[62277]: DEBUG nova.compute.manager [req-59e8a317-8394-4fda-b708-feaae7ef4a72 req-6a44c4bd-3ec2-4879-b141-2528b47f9c4d service nova] [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] No waiting events found dispatching network-vif-plugged-0bd5121b-3a4f-4aeb-9c5e-57ef44c5f729 {{(pid=62277) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 992.436960] env[62277]: WARNING nova.compute.manager [req-59e8a317-8394-4fda-b708-feaae7ef4a72 req-6a44c4bd-3ec2-4879-b141-2528b47f9c4d service nova] [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] Received unexpected event network-vif-plugged-0bd5121b-3a4f-4aeb-9c5e-57ef44c5f729 for instance with vm_state building and task_state spawning. [ 992.580603] env[62277]: DEBUG nova.compute.manager [req-3d59584c-35c6-46f5-9eee-4c4be3fef8f4 req-27fda3a4-cd49-4da9-ac26-7fe2ca34da1d service nova] [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] Received event network-changed-1cf98980-9b84-419e-859c-f9eedf9570e0 {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 992.580830] env[62277]: DEBUG nova.compute.manager [req-3d59584c-35c6-46f5-9eee-4c4be3fef8f4 req-27fda3a4-cd49-4da9-ac26-7fe2ca34da1d service nova] [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] Refreshing instance network info cache due to event network-changed-1cf98980-9b84-419e-859c-f9eedf9570e0. 
{{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11104}} [ 992.581089] env[62277]: DEBUG oslo_concurrency.lockutils [req-3d59584c-35c6-46f5-9eee-4c4be3fef8f4 req-27fda3a4-cd49-4da9-ac26-7fe2ca34da1d service nova] Acquiring lock "refresh_cache-1829c328-2a68-4297-b80e-afb0e898ba72" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 992.581252] env[62277]: DEBUG oslo_concurrency.lockutils [req-3d59584c-35c6-46f5-9eee-4c4be3fef8f4 req-27fda3a4-cd49-4da9-ac26-7fe2ca34da1d service nova] Acquired lock "refresh_cache-1829c328-2a68-4297-b80e-afb0e898ba72" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 992.581414] env[62277]: DEBUG nova.network.neutron [req-3d59584c-35c6-46f5-9eee-4c4be3fef8f4 req-27fda3a4-cd49-4da9-ac26-7fe2ca34da1d service nova] [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] Refreshing network info cache for port 1cf98980-9b84-419e-859c-f9eedf9570e0 {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 992.818060] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405291, 'name': CreateVM_Task, 'duration_secs': 0.425054} completed successfully. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 992.818584] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] Created VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 992.819416] env[62277]: DEBUG oslo_concurrency.lockutils [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 992.819712] env[62277]: DEBUG oslo_concurrency.lockutils [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 992.820153] env[62277]: DEBUG oslo_concurrency.lockutils [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 992.820510] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4077d265-88b6-46b2-81d3-632de6ddd7a7 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 992.825416] env[62277]: DEBUG oslo_vmware.api [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Waiting for the task: (returnval){ [ 992.825416] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52580de8-ea79-4b4e-5c3d-a36a1ff8f00e" [ 992.825416] env[62277]: _type = "Task" [ 992.825416] env[62277]: } to complete. 
{{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 992.833561] env[62277]: DEBUG oslo_vmware.api [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52580de8-ea79-4b4e-5c3d-a36a1ff8f00e, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 993.344342] env[62277]: DEBUG oslo_concurrency.lockutils [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 993.344342] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] Processing image 6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 993.344342] env[62277]: DEBUG oslo_concurrency.lockutils [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 993.862933] env[62277]: DEBUG nova.network.neutron [req-3d59584c-35c6-46f5-9eee-4c4be3fef8f4 req-27fda3a4-cd49-4da9-ac26-7fe2ca34da1d service nova] [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] Updated VIF entry in instance network info cache for port 1cf98980-9b84-419e-859c-f9eedf9570e0. 
{{(pid=62277) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 993.862933] env[62277]: DEBUG nova.network.neutron [req-3d59584c-35c6-46f5-9eee-4c4be3fef8f4 req-27fda3a4-cd49-4da9-ac26-7fe2ca34da1d service nova] [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] Updating instance_info_cache with network_info: [{"id": "1cf98980-9b84-419e-859c-f9eedf9570e0", "address": "fa:16:3e:94:15:2b", "network": {"id": "2c64e00f-4995-4827-b685-5096b1d1d064", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.49", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "63ed6630e9c140baa826f53d7a0564d1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "77aa121f-8fb6-42f3-aaea-43addfe449b2", "external-id": "nsx-vlan-transportzone-288", "segmentation_id": 288, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1cf98980-9b", "ovs_interfaceid": "1cf98980-9b84-419e-859c-f9eedf9570e0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 993.873912] env[62277]: DEBUG nova.network.neutron [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] Successfully created port: c460565a-65c2-4cca-8ffa-2d386aa65882 {{(pid=62277) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 993.881505] env[62277]: DEBUG oslo_concurrency.lockutils [req-3d59584c-35c6-46f5-9eee-4c4be3fef8f4 req-27fda3a4-cd49-4da9-ac26-7fe2ca34da1d service nova] Releasing lock "refresh_cache-1829c328-2a68-4297-b80e-afb0e898ba72" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 994.275127] env[62277]: DEBUG nova.network.neutron [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] Successfully updated port: 45d69044-2875-4a53-9b40-c309ba6ee1bb {{(pid=62277) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 994.288302] env[62277]: DEBUG oslo_concurrency.lockutils [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Acquiring lock "refresh_cache-d68ccb50-a04d-4e59-8161-f01305eb81a8" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 994.288302] env[62277]: DEBUG oslo_concurrency.lockutils [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Acquired lock "refresh_cache-d68ccb50-a04d-4e59-8161-f01305eb81a8" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 994.288302] env[62277]: DEBUG nova.network.neutron [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 
tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] Building network info cache for instance {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 994.408618] env[62277]: DEBUG nova.network.neutron [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] Instance cache missing network info. {{(pid=62277) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 994.820881] env[62277]: DEBUG nova.network.neutron [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] Updating instance_info_cache with network_info: [{"id": "45d69044-2875-4a53-9b40-c309ba6ee1bb", "address": "fa:16:3e:d4:8a:ec", "network": {"id": "9659686e-adcf-449c-b77c-fde92b6002af", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-132762464-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1f956b4ec7ac40598a99a6cef7308e72", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "75ff81f9-72b2-4e58-a8d8-5699907f7459", "external-id": "nsx-vlan-transportzone-978", "segmentation_id": 978, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap45d69044-28", "ovs_interfaceid": "45d69044-2875-4a53-9b40-c309ba6ee1bb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 994.840794] env[62277]: DEBUG oslo_concurrency.lockutils [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Releasing lock "refresh_cache-d68ccb50-a04d-4e59-8161-f01305eb81a8" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 994.841176] env[62277]: DEBUG nova.compute.manager [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] Instance network_info: |[{"id": "45d69044-2875-4a53-9b40-c309ba6ee1bb", "address": "fa:16:3e:d4:8a:ec", "network": {"id": "9659686e-adcf-449c-b77c-fde92b6002af", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-132762464-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1f956b4ec7ac40598a99a6cef7308e72", "mtu": 8950, "physical_network": "default", 
"tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "75ff81f9-72b2-4e58-a8d8-5699907f7459", "external-id": "nsx-vlan-transportzone-978", "segmentation_id": 978, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap45d69044-28", "ovs_interfaceid": "45d69044-2875-4a53-9b40-c309ba6ee1bb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 994.842118] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:d4:8a:ec', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '75ff81f9-72b2-4e58-a8d8-5699907f7459', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '45d69044-2875-4a53-9b40-c309ba6ee1bb', 'vif_model': 'vmxnet3'}] {{(pid=62277) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 994.849763] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Creating folder: Project (1f956b4ec7ac40598a99a6cef7308e72). Parent ref: group-v297781. {{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 994.850320] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1384515a-3007-49e1-b466-457ccb27215c {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 994.864019] env[62277]: INFO nova.virt.vmwareapi.vm_util [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Created folder: Project (1f956b4ec7ac40598a99a6cef7308e72) in parent group-v297781. [ 994.864019] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Creating folder: Instances. Parent ref: group-v297791. {{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 994.864019] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c21f8e41-5f59-44fb-9ad4-8c6cf96dfbd8 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 994.874060] env[62277]: INFO nova.virt.vmwareapi.vm_util [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Created folder: Instances in parent group-v297791. [ 994.876015] env[62277]: DEBUG oslo.service.loopingcall [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 994.876015] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] Creating VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 994.876015] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-89b9c71c-8636-4bba-a06c-c13aa541e6b7 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 994.898790] env[62277]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 994.898790] env[62277]: value = "task-1405294" [ 994.898790] env[62277]: _type = "Task" [ 994.898790] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 994.907948] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405294, 'name': CreateVM_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 995.147380] env[62277]: DEBUG oslo_concurrency.lockutils [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Acquiring lock "dfc291fd-1481-4e76-9fb3-ec87124c1281" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 995.147666] env[62277]: DEBUG oslo_concurrency.lockutils [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Lock "dfc291fd-1481-4e76-9fb3-ec87124c1281" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 995.175642] env[62277]: DEBUG nova.compute.manager [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] Starting instance... 
{{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 995.273901] env[62277]: DEBUG oslo_concurrency.lockutils [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 995.276073] env[62277]: DEBUG oslo_concurrency.lockutils [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 995.276073] env[62277]: INFO nova.compute.claims [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 995.415880] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405294, 'name': CreateVM_Task} progress is 99%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 995.532739] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a4aa0373-c7ab-41a4-95b0-03dfd4fed653 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 995.546739] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a9f58940-afd7-4083-91cb-ab291020f1d0 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 995.587432] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c22cf4ef-e998-46ae-bdf3-9a4b33d0f8d2 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 995.596060] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4d8712ff-1a95-4136-a4c5-af4bdfdfe2ce {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 995.616228] env[62277]: DEBUG nova.compute.provider_tree [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 995.636952] env[62277]: DEBUG nova.scheduler.client.report [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) 
set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 995.660868] env[62277]: DEBUG oslo_concurrency.lockutils [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.385s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 995.660868] env[62277]: DEBUG nova.compute.manager [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] Start building networks asynchronously for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 995.714752] env[62277]: DEBUG nova.compute.utils [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Using /dev/sd instead of None {{(pid=62277) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 995.716101] env[62277]: DEBUG nova.compute.manager [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] Allocating IP information in the background. {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 995.716262] env[62277]: DEBUG nova.network.neutron [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] allocate_for_instance() {{(pid=62277) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 995.736724] env[62277]: DEBUG nova.compute.manager [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] Start building block device mappings for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 995.846465] env[62277]: DEBUG nova.compute.manager [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] Start spawning the instance on the hypervisor. 
{{(pid=62277) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 995.865977] env[62277]: DEBUG nova.policy [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'df47e09c9543414aa07181d1facd8cd1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '37c92b5660e84218890a498bbe1519b5', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62277) authorize /opt/stack/nova/nova/policy.py:203}} [ 995.881641] env[62277]: DEBUG nova.virt.hardware [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-11-06T22:11:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-11-06T22:11:15Z,direct_url=,disk_format='vmdk',id=6f125163-af69-40e9-92ae-3b8a01d74b60,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='63ed6630e9c140baa826f53d7a0564d1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-11-06T22:11:16Z,virtual_size=,visibility=), allow threads: False {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 995.881904] env[62277]: DEBUG nova.virt.hardware [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Flavor limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 995.882075] env[62277]: DEBUG nova.virt.hardware [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Image limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 995.882255] env[62277]: DEBUG nova.virt.hardware [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Flavor pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 995.882393] env[62277]: DEBUG nova.virt.hardware [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Image pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 995.882537] env[62277]: DEBUG nova.virt.hardware [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 995.882733] env[62277]: DEBUG nova.virt.hardware [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 
tempest-ServersTestJSON-2037769424-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 995.883388] env[62277]: DEBUG nova.virt.hardware [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 995.883388] env[62277]: DEBUG nova.virt.hardware [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Got 1 possible topologies {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 995.884903] env[62277]: DEBUG nova.virt.hardware [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 995.884903] env[62277]: DEBUG nova.virt.hardware [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 995.885363] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-46141a8d-e41b-4f8c-ae78-5d75513c09fd {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 995.899093] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dfe4a279-68e0-45f8-92e5-2761a9c2bccd {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 995.919101] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405294, 'name': CreateVM_Task} progress is 99%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 996.382324] env[62277]: DEBUG nova.network.neutron [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] Successfully updated port: c460565a-65c2-4cca-8ffa-2d386aa65882 {{(pid=62277) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 996.398832] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Acquiring lock "refresh_cache-19d15611-315f-4c4b-8f32-e5d00d0d8ca8" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 996.398979] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Acquired lock "refresh_cache-19d15611-315f-4c4b-8f32-e5d00d0d8ca8" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 996.399142] env[62277]: DEBUG nova.network.neutron [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] Building network info cache for instance {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 996.418697] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405294, 'name': CreateVM_Task, 'duration_secs': 1.325369} completed successfully. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 996.419385] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] Created VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 996.420371] env[62277]: DEBUG oslo_concurrency.lockutils [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 996.420371] env[62277]: DEBUG oslo_concurrency.lockutils [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 996.420455] env[62277]: DEBUG oslo_concurrency.lockutils [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 996.420938] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9f5dd276-3ebf-464c-93ea-4c015aff3b29 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 996.427811] env[62277]: DEBUG oslo_vmware.api [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Waiting for the task: (returnval){ [ 996.427811] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52b20741-83f5-466c-134f-e2cd7009ff15" [ 996.427811] env[62277]: _type = "Task" [ 996.427811] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 996.439769] env[62277]: DEBUG oslo_vmware.api [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52b20741-83f5-466c-134f-e2cd7009ff15, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 996.519634] env[62277]: DEBUG nova.network.neutron [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] Instance cache missing network info. 
{{(pid=62277) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 996.939838] env[62277]: DEBUG oslo_concurrency.lockutils [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 996.940130] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] Processing image 6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 996.940349] env[62277]: DEBUG oslo_concurrency.lockutils [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 996.975720] env[62277]: DEBUG nova.network.neutron [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] Updating instance_info_cache with network_info: [{"id": "c460565a-65c2-4cca-8ffa-2d386aa65882", "address": "fa:16:3e:75:52:26", "network": {"id": "eda6b8aa-c2b9-4c83-bc9a-09322648796f", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1860511403-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "29dbbf00030a407986371193ea850423", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "30c39e9a-a798-4f25-a48c-91f786ba332c", "external-id": "nsx-vlan-transportzone-438", "segmentation_id": 438, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc460565a-65", "ovs_interfaceid": "c460565a-65c2-4cca-8ffa-2d386aa65882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 996.990157] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Releasing lock "refresh_cache-19d15611-315f-4c4b-8f32-e5d00d0d8ca8" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 996.992134] env[62277]: DEBUG nova.compute.manager [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] 
Instance network_info: |[{"id": "c460565a-65c2-4cca-8ffa-2d386aa65882", "address": "fa:16:3e:75:52:26", "network": {"id": "eda6b8aa-c2b9-4c83-bc9a-09322648796f", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1860511403-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "29dbbf00030a407986371193ea850423", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "30c39e9a-a798-4f25-a48c-91f786ba332c", "external-id": "nsx-vlan-transportzone-438", "segmentation_id": 438, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc460565a-65", "ovs_interfaceid": "c460565a-65c2-4cca-8ffa-2d386aa65882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 996.992239] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:75:52:26', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '30c39e9a-a798-4f25-a48c-91f786ba332c', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'c460565a-65c2-4cca-8ffa-2d386aa65882', 'vif_model': 'vmxnet3'}] {{(pid=62277) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 997.000399] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Creating folder: Project (29dbbf00030a407986371193ea850423). Parent ref: group-v297781. {{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 997.000997] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-388b9683-86d3-4f93-97a8-cd24f312ff6d {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 997.014782] env[62277]: INFO nova.virt.vmwareapi.vm_util [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Created folder: Project (29dbbf00030a407986371193ea850423) in parent group-v297781. [ 997.015470] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Creating folder: Instances. Parent ref: group-v297794. 
{{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 997.016075] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-76bb9192-bcc3-4264-952b-767cac455dc0 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 997.027221] env[62277]: INFO nova.virt.vmwareapi.vm_util [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Created folder: Instances in parent group-v297794. [ 997.027477] env[62277]: DEBUG oslo.service.loopingcall [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 997.027673] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] Creating VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 997.028023] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-0d33940f-5781-49ea-8188-75d2e46c0cbb {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 997.050493] env[62277]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 997.050493] env[62277]: value = "task-1405297" [ 997.050493] env[62277]: _type = "Task" [ 997.050493] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 997.059685] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405297, 'name': CreateVM_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 997.451706] env[62277]: DEBUG nova.network.neutron [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] Successfully created port: 1ed02afb-746f-473b-828a-aae94ab6258f {{(pid=62277) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 997.562204] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405297, 'name': CreateVM_Task, 'duration_secs': 0.306265} completed successfully. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 997.562472] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] Created VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 997.563677] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 997.563677] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 997.564021] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 997.564283] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-45ff686a-03d6-4eed-a72d-2369418df03c {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 997.570935] env[62277]: DEBUG oslo_vmware.api [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Waiting for the task: (returnval){ [ 997.570935] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52875189-ae25-6104-03cd-f9d8e8ae4c64" [ 997.570935] env[62277]: _type = "Task" [ 997.570935] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 997.580626] env[62277]: DEBUG oslo_vmware.api [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52875189-ae25-6104-03cd-f9d8e8ae4c64, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 998.082876] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 998.083201] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] Processing image 6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 998.083479] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 998.421644] env[62277]: DEBUG nova.compute.manager [req-423e8a79-7761-4619-9036-a08784f19019 req-0a9bcac3-6823-4056-bff0-30930bf806e3 service nova] [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] Received event network-changed-0bd5121b-3a4f-4aeb-9c5e-57ef44c5f729 {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 998.423774] env[62277]: DEBUG nova.compute.manager [req-423e8a79-7761-4619-9036-a08784f19019 req-0a9bcac3-6823-4056-bff0-30930bf806e3 service nova] [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] Refreshing instance network info cache due to event network-changed-0bd5121b-3a4f-4aeb-9c5e-57ef44c5f729. 
{{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11104}} [ 998.423774] env[62277]: DEBUG oslo_concurrency.lockutils [req-423e8a79-7761-4619-9036-a08784f19019 req-0a9bcac3-6823-4056-bff0-30930bf806e3 service nova] Acquiring lock "refresh_cache-3d260cd8-ab21-4e1e-8891-6f216350a587" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 998.423774] env[62277]: DEBUG oslo_concurrency.lockutils [req-423e8a79-7761-4619-9036-a08784f19019 req-0a9bcac3-6823-4056-bff0-30930bf806e3 service nova] Acquired lock "refresh_cache-3d260cd8-ab21-4e1e-8891-6f216350a587" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 998.423774] env[62277]: DEBUG nova.network.neutron [req-423e8a79-7761-4619-9036-a08784f19019 req-0a9bcac3-6823-4056-bff0-30930bf806e3 service nova] [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] Refreshing network info cache for port 0bd5121b-3a4f-4aeb-9c5e-57ef44c5f729 {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 999.271504] env[62277]: DEBUG nova.compute.manager [req-114e2605-ebbe-423f-9c4e-c2337c3fcaf1 req-6e8b0e77-e123-4837-821f-71a5808ca25c service nova] [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] Received event network-vif-plugged-c460565a-65c2-4cca-8ffa-2d386aa65882 {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 999.271740] env[62277]: DEBUG oslo_concurrency.lockutils [req-114e2605-ebbe-423f-9c4e-c2337c3fcaf1 req-6e8b0e77-e123-4837-821f-71a5808ca25c service nova] Acquiring lock "19d15611-315f-4c4b-8f32-e5d00d0d8ca8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 999.271923] env[62277]: DEBUG oslo_concurrency.lockutils [req-114e2605-ebbe-423f-9c4e-c2337c3fcaf1 req-6e8b0e77-e123-4837-821f-71a5808ca25c service nova] Lock "19d15611-315f-4c4b-8f32-e5d00d0d8ca8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 999.273471] env[62277]: DEBUG oslo_concurrency.lockutils [req-114e2605-ebbe-423f-9c4e-c2337c3fcaf1 req-6e8b0e77-e123-4837-821f-71a5808ca25c service nova] Lock "19d15611-315f-4c4b-8f32-e5d00d0d8ca8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 999.274731] env[62277]: DEBUG nova.compute.manager [req-114e2605-ebbe-423f-9c4e-c2337c3fcaf1 req-6e8b0e77-e123-4837-821f-71a5808ca25c service nova] [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] No waiting events found dispatching network-vif-plugged-c460565a-65c2-4cca-8ffa-2d386aa65882 {{(pid=62277) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 999.274780] env[62277]: WARNING nova.compute.manager [req-114e2605-ebbe-423f-9c4e-c2337c3fcaf1 req-6e8b0e77-e123-4837-821f-71a5808ca25c service nova] [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] Received unexpected event network-vif-plugged-c460565a-65c2-4cca-8ffa-2d386aa65882 for instance with vm_state building and task_state spawning. 
[ 999.604157] env[62277]: DEBUG nova.network.neutron [req-423e8a79-7761-4619-9036-a08784f19019 req-0a9bcac3-6823-4056-bff0-30930bf806e3 service nova] [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] Updated VIF entry in instance network info cache for port 0bd5121b-3a4f-4aeb-9c5e-57ef44c5f729. {{(pid=62277) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 999.604458] env[62277]: DEBUG nova.network.neutron [req-423e8a79-7761-4619-9036-a08784f19019 req-0a9bcac3-6823-4056-bff0-30930bf806e3 service nova] [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] Updating instance_info_cache with network_info: [{"id": "0bd5121b-3a4f-4aeb-9c5e-57ef44c5f729", "address": "fa:16:3e:91:de:e9", "network": {"id": "2c64e00f-4995-4827-b685-5096b1d1d064", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.139", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "63ed6630e9c140baa826f53d7a0564d1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "77aa121f-8fb6-42f3-aaea-43addfe449b2", "external-id": "nsx-vlan-transportzone-288", "segmentation_id": 288, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0bd5121b-3a", "ovs_interfaceid": "0bd5121b-3a4f-4aeb-9c5e-57ef44c5f729", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 999.615283] env[62277]: DEBUG oslo_concurrency.lockutils [req-423e8a79-7761-4619-9036-a08784f19019 req-0a9bcac3-6823-4056-bff0-30930bf806e3 service nova] Releasing lock "refresh_cache-3d260cd8-ab21-4e1e-8891-6f216350a587" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 999.615533] env[62277]: DEBUG nova.compute.manager [req-423e8a79-7761-4619-9036-a08784f19019 req-0a9bcac3-6823-4056-bff0-30930bf806e3 service nova] [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] Received event network-vif-plugged-45d69044-2875-4a53-9b40-c309ba6ee1bb {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 999.615724] env[62277]: DEBUG oslo_concurrency.lockutils [req-423e8a79-7761-4619-9036-a08784f19019 req-0a9bcac3-6823-4056-bff0-30930bf806e3 service nova] Acquiring lock "d68ccb50-a04d-4e59-8161-f01305eb81a8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 999.619020] env[62277]: DEBUG oslo_concurrency.lockutils [req-423e8a79-7761-4619-9036-a08784f19019 req-0a9bcac3-6823-4056-bff0-30930bf806e3 service nova] Lock "d68ccb50-a04d-4e59-8161-f01305eb81a8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 999.619020] env[62277]: DEBUG oslo_concurrency.lockutils [req-423e8a79-7761-4619-9036-a08784f19019 req-0a9bcac3-6823-4056-bff0-30930bf806e3 service nova] Lock 
"d68ccb50-a04d-4e59-8161-f01305eb81a8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 999.619020] env[62277]: DEBUG nova.compute.manager [req-423e8a79-7761-4619-9036-a08784f19019 req-0a9bcac3-6823-4056-bff0-30930bf806e3 service nova] [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] No waiting events found dispatching network-vif-plugged-45d69044-2875-4a53-9b40-c309ba6ee1bb {{(pid=62277) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 999.619020] env[62277]: WARNING nova.compute.manager [req-423e8a79-7761-4619-9036-a08784f19019 req-0a9bcac3-6823-4056-bff0-30930bf806e3 service nova] [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] Received unexpected event network-vif-plugged-45d69044-2875-4a53-9b40-c309ba6ee1bb for instance with vm_state building and task_state spawning. [ 999.619299] env[62277]: DEBUG nova.compute.manager [req-423e8a79-7761-4619-9036-a08784f19019 req-0a9bcac3-6823-4056-bff0-30930bf806e3 service nova] [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] Received event network-changed-45d69044-2875-4a53-9b40-c309ba6ee1bb {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 999.619299] env[62277]: DEBUG nova.compute.manager [req-423e8a79-7761-4619-9036-a08784f19019 req-0a9bcac3-6823-4056-bff0-30930bf806e3 service nova] [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] Refreshing instance network info cache due to event network-changed-45d69044-2875-4a53-9b40-c309ba6ee1bb. {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11104}} [ 999.619299] env[62277]: DEBUG oslo_concurrency.lockutils [req-423e8a79-7761-4619-9036-a08784f19019 req-0a9bcac3-6823-4056-bff0-30930bf806e3 service nova] Acquiring lock "refresh_cache-d68ccb50-a04d-4e59-8161-f01305eb81a8" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 999.619299] env[62277]: DEBUG oslo_concurrency.lockutils [req-423e8a79-7761-4619-9036-a08784f19019 req-0a9bcac3-6823-4056-bff0-30930bf806e3 service nova] Acquired lock "refresh_cache-d68ccb50-a04d-4e59-8161-f01305eb81a8" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 999.619299] env[62277]: DEBUG nova.network.neutron [req-423e8a79-7761-4619-9036-a08784f19019 req-0a9bcac3-6823-4056-bff0-30930bf806e3 service nova] [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] Refreshing network info cache for port 45d69044-2875-4a53-9b40-c309ba6ee1bb {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1000.263783] env[62277]: DEBUG nova.network.neutron [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] Successfully updated port: 1ed02afb-746f-473b-828a-aae94ab6258f {{(pid=62277) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1000.281490] env[62277]: DEBUG oslo_concurrency.lockutils [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Acquiring lock "refresh_cache-dfc291fd-1481-4e76-9fb3-ec87124c1281" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1000.281742] env[62277]: DEBUG oslo_concurrency.lockutils [None 
req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Acquired lock "refresh_cache-dfc291fd-1481-4e76-9fb3-ec87124c1281" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1000.281792] env[62277]: DEBUG nova.network.neutron [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] Building network info cache for instance {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1000.395253] env[62277]: DEBUG nova.network.neutron [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] Instance cache missing network info. {{(pid=62277) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1000.853473] env[62277]: DEBUG nova.network.neutron [req-423e8a79-7761-4619-9036-a08784f19019 req-0a9bcac3-6823-4056-bff0-30930bf806e3 service nova] [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] Updated VIF entry in instance network info cache for port 45d69044-2875-4a53-9b40-c309ba6ee1bb. {{(pid=62277) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1000.853865] env[62277]: DEBUG nova.network.neutron [req-423e8a79-7761-4619-9036-a08784f19019 req-0a9bcac3-6823-4056-bff0-30930bf806e3 service nova] [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] Updating instance_info_cache with network_info: [{"id": "45d69044-2875-4a53-9b40-c309ba6ee1bb", "address": "fa:16:3e:d4:8a:ec", "network": {"id": "9659686e-adcf-449c-b77c-fde92b6002af", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-132762464-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1f956b4ec7ac40598a99a6cef7308e72", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "75ff81f9-72b2-4e58-a8d8-5699907f7459", "external-id": "nsx-vlan-transportzone-978", "segmentation_id": 978, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap45d69044-28", "ovs_interfaceid": "45d69044-2875-4a53-9b40-c309ba6ee1bb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1000.869941] env[62277]: DEBUG oslo_concurrency.lockutils [req-423e8a79-7761-4619-9036-a08784f19019 req-0a9bcac3-6823-4056-bff0-30930bf806e3 service nova] Releasing lock "refresh_cache-d68ccb50-a04d-4e59-8161-f01305eb81a8" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1001.146683] env[62277]: DEBUG oslo_concurrency.lockutils [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Acquiring lock "36ff1435-1999-4e95-8920-81a1b25cc452" by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1001.147165] env[62277]: DEBUG oslo_concurrency.lockutils [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Lock "36ff1435-1999-4e95-8920-81a1b25cc452" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1001.161254] env[62277]: DEBUG nova.compute.manager [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1001.261051] env[62277]: DEBUG oslo_concurrency.lockutils [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1001.261322] env[62277]: DEBUG oslo_concurrency.lockutils [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1001.262912] env[62277]: INFO nova.compute.claims [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1001.342685] env[62277]: DEBUG nova.network.neutron [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] Updating instance_info_cache with network_info: [{"id": "1ed02afb-746f-473b-828a-aae94ab6258f", "address": "fa:16:3e:ba:1f:b3", "network": {"id": "712edd31-d6ed-48d2-989d-d16b2d30d012", "bridge": "br-int", "label": "tempest-ServersTestJSON-912307607-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "37c92b5660e84218890a498bbe1519b5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8b29df12-5674-476d-a9e5-5e20f704d224", "external-id": "nsx-vlan-transportzone-754", "segmentation_id": 754, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1ed02afb-74", "ovs_interfaceid": "1ed02afb-746f-473b-828a-aae94ab6258f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": 
"normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1001.370262] env[62277]: DEBUG oslo_concurrency.lockutils [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Releasing lock "refresh_cache-dfc291fd-1481-4e76-9fb3-ec87124c1281" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1001.374488] env[62277]: DEBUG nova.compute.manager [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] Instance network_info: |[{"id": "1ed02afb-746f-473b-828a-aae94ab6258f", "address": "fa:16:3e:ba:1f:b3", "network": {"id": "712edd31-d6ed-48d2-989d-d16b2d30d012", "bridge": "br-int", "label": "tempest-ServersTestJSON-912307607-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "37c92b5660e84218890a498bbe1519b5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8b29df12-5674-476d-a9e5-5e20f704d224", "external-id": "nsx-vlan-transportzone-754", "segmentation_id": 754, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1ed02afb-74", "ovs_interfaceid": "1ed02afb-746f-473b-828a-aae94ab6258f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1001.374643] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:ba:1f:b3', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '8b29df12-5674-476d-a9e5-5e20f704d224', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '1ed02afb-746f-473b-828a-aae94ab6258f', 'vif_model': 'vmxnet3'}] {{(pid=62277) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1001.387488] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Creating folder: Project (37c92b5660e84218890a498bbe1519b5). Parent ref: group-v297781. 
{{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1001.393298] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-affa7b85-0c94-42d4-abfb-b59336b9f199 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1001.410525] env[62277]: INFO nova.virt.vmwareapi.vm_util [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Created folder: Project (37c92b5660e84218890a498bbe1519b5) in parent group-v297781. [ 1001.410703] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Creating folder: Instances. Parent ref: group-v297797. {{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1001.411242] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-620b5e0f-b459-4b4a-9159-5b812c0e90eb {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1001.422446] env[62277]: INFO nova.virt.vmwareapi.vm_util [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Created folder: Instances in parent group-v297797. [ 1001.422889] env[62277]: DEBUG oslo.service.loopingcall [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1001.423209] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] Creating VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1001.423625] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-cd0c89d4-4e8f-4121-89bb-4a8367ffc4f2 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1001.450579] env[62277]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1001.450579] env[62277]: value = "task-1405300" [ 1001.450579] env[62277]: _type = "Task" [ 1001.450579] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1001.459600] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405300, 'name': CreateVM_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1001.556885] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca5e596a-7ad4-45da-afc0-808dbe7d3a50 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1001.578455] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-697456c1-b74d-4d7c-8daa-d28c3dbe8941 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1001.619052] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4d5a3179-7f2b-42e8-905d-c76742184c98 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1001.627133] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e9711d11-9c70-4532-93b5-bf6c3cd47770 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1001.641395] env[62277]: DEBUG nova.compute.provider_tree [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1001.655770] env[62277]: DEBUG nova.scheduler.client.report [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1001.678720] env[62277]: DEBUG oslo_concurrency.lockutils [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.417s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1001.679560] env[62277]: DEBUG nova.compute.manager [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] Start building networks asynchronously for instance. 
{{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1001.736758] env[62277]: DEBUG nova.compute.utils [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Using /dev/sd instead of None {{(pid=62277) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1001.740029] env[62277]: DEBUG nova.compute.manager [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] Allocating IP information in the background. {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1001.740321] env[62277]: DEBUG nova.network.neutron [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] allocate_for_instance() {{(pid=62277) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1001.760725] env[62277]: DEBUG nova.compute.manager [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] Start building block device mappings for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1001.859129] env[62277]: DEBUG nova.compute.manager [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] Start spawning the instance on the hypervisor. 
{{(pid=62277) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1001.891887] env[62277]: DEBUG nova.virt.hardware [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-11-06T22:11:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-11-06T22:11:15Z,direct_url=,disk_format='vmdk',id=6f125163-af69-40e9-92ae-3b8a01d74b60,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='63ed6630e9c140baa826f53d7a0564d1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-11-06T22:11:16Z,virtual_size=,visibility=), allow threads: False {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1001.892144] env[62277]: DEBUG nova.virt.hardware [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Flavor limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1001.892495] env[62277]: DEBUG nova.virt.hardware [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Image limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1001.892693] env[62277]: DEBUG nova.virt.hardware [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Flavor pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1001.892833] env[62277]: DEBUG nova.virt.hardware [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Image pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1001.892974] env[62277]: DEBUG nova.virt.hardware [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1001.893196] env[62277]: DEBUG nova.virt.hardware [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1001.893349] env[62277]: DEBUG nova.virt.hardware [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1001.893504] env[62277]: DEBUG nova.virt.hardware [None 
req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Got 1 possible topologies {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1001.893667] env[62277]: DEBUG nova.virt.hardware [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1001.893812] env[62277]: DEBUG nova.virt.hardware [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1001.894700] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f5ce7bc1-5f90-4d5a-8c46-c78dc6aee474 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1001.904201] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aa1b4abe-151f-4e8d-9eb9-6fba52d44c3f {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1001.927013] env[62277]: DEBUG nova.policy [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0ec8b5ed58474c4b874c1231ac8c92e1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f56098c52907445bb4675268403fe9f6', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62277) authorize /opt/stack/nova/nova/policy.py:203}} [ 1001.960702] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405300, 'name': CreateVM_Task, 'duration_secs': 0.359061} completed successfully. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1001.960923] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] Created VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1001.961588] env[62277]: DEBUG oslo_concurrency.lockutils [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1001.961736] env[62277]: DEBUG oslo_concurrency.lockutils [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1001.962055] env[62277]: DEBUG oslo_concurrency.lockutils [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1001.962328] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-711d17da-5439-43fd-85fb-a0c64828644c {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1001.967301] env[62277]: DEBUG oslo_vmware.api [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Waiting for the task: (returnval){ [ 1001.967301] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52ee5ffa-fe74-2e76-f0fc-a9d177752063" [ 1001.967301] env[62277]: _type = "Task" [ 1001.967301] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1001.976185] env[62277]: DEBUG oslo_vmware.api [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52ee5ffa-fe74-2e76-f0fc-a9d177752063, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1002.460235] env[62277]: DEBUG nova.network.neutron [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] Successfully created port: 7d48e1c6-05b6-4a29-b3f3-b901df38433f {{(pid=62277) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1002.479792] env[62277]: DEBUG oslo_concurrency.lockutils [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1002.480083] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] Processing image 6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1002.481017] env[62277]: DEBUG oslo_concurrency.lockutils [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1002.489626] env[62277]: DEBUG oslo_concurrency.lockutils [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Acquiring lock "68925f1b-da69-4955-acb1-d6500b03daee" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1002.491457] env[62277]: DEBUG oslo_concurrency.lockutils [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Lock "68925f1b-da69-4955-acb1-d6500b03daee" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1002.508434] env[62277]: DEBUG nova.compute.manager [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] [instance: 68925f1b-da69-4955-acb1-d6500b03daee] Starting instance... 
{{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1002.569033] env[62277]: DEBUG oslo_concurrency.lockutils [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1002.569033] env[62277]: DEBUG oslo_concurrency.lockutils [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1002.570472] env[62277]: INFO nova.compute.claims [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] [instance: 68925f1b-da69-4955-acb1-d6500b03daee] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1002.781721] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f4faca8a-c49f-485b-87f2-796872d4a900 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1002.789734] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c70fd5b2-cebb-430a-aa91-2f16fd2d8541 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1002.827600] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b5d93cab-3ae3-482b-8d0c-62fa79b6caf3 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1002.832637] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-90182a1b-9553-430a-84c6-e53de6720d34 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1002.847898] env[62277]: DEBUG nova.compute.provider_tree [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1002.857947] env[62277]: DEBUG nova.scheduler.client.report [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1002.885415] env[62277]: DEBUG 
oslo_concurrency.lockutils [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.316s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1002.885415] env[62277]: DEBUG nova.compute.manager [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] [instance: 68925f1b-da69-4955-acb1-d6500b03daee] Start building networks asynchronously for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1002.930575] env[62277]: DEBUG nova.compute.utils [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Using /dev/sd instead of None {{(pid=62277) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1002.932189] env[62277]: DEBUG nova.compute.manager [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] [instance: 68925f1b-da69-4955-acb1-d6500b03daee] Not allocating networking since 'none' was specified. {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1002.948163] env[62277]: DEBUG nova.compute.manager [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] [instance: 68925f1b-da69-4955-acb1-d6500b03daee] Start building block device mappings for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1003.033589] env[62277]: DEBUG nova.compute.manager [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] [instance: 68925f1b-da69-4955-acb1-d6500b03daee] Start spawning the instance on the hypervisor. 
{{(pid=62277) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1003.065699] env[62277]: DEBUG nova.virt.hardware [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-11-06T22:11:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-11-06T22:11:15Z,direct_url=,disk_format='vmdk',id=6f125163-af69-40e9-92ae-3b8a01d74b60,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='63ed6630e9c140baa826f53d7a0564d1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-11-06T22:11:16Z,virtual_size=,visibility=), allow threads: False {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1003.065699] env[62277]: DEBUG nova.virt.hardware [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Flavor limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1003.065838] env[62277]: DEBUG nova.virt.hardware [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Image limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1003.066018] env[62277]: DEBUG nova.virt.hardware [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Flavor pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1003.066068] env[62277]: DEBUG nova.virt.hardware [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Image pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1003.066210] env[62277]: DEBUG nova.virt.hardware [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1003.070395] env[62277]: DEBUG nova.virt.hardware [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1003.070395] env[62277]: DEBUG nova.virt.hardware [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} 
[ 1003.070395] env[62277]: DEBUG nova.virt.hardware [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Got 1 possible topologies {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1003.070395] env[62277]: DEBUG nova.virt.hardware [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1003.070395] env[62277]: DEBUG nova.virt.hardware [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1003.070640] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6139acf3-fc1a-463f-8b88-23dee6b64c81 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1003.078107] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-934a97af-147f-4166-9740-81f02be92e4b {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1003.093089] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] [instance: 68925f1b-da69-4955-acb1-d6500b03daee] Instance VIF info [] {{(pid=62277) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1003.099124] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Creating folder: Project (e1511421e29e471f868aff5315f948bd). Parent ref: group-v297781. {{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1003.099261] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-23a09c25-2228-4802-968c-f15ea7be034c {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1003.110467] env[62277]: INFO nova.virt.vmwareapi.vm_util [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Created folder: Project (e1511421e29e471f868aff5315f948bd) in parent group-v297781. [ 1003.110467] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Creating folder: Instances. Parent ref: group-v297803. 
{{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1003.110467] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-89ef5b35-676d-4d53-ba4c-3b7d1bd9cf50 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1003.120395] env[62277]: INFO nova.virt.vmwareapi.vm_util [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Created folder: Instances in parent group-v297803. [ 1003.120984] env[62277]: DEBUG oslo.service.loopingcall [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1003.121092] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 68925f1b-da69-4955-acb1-d6500b03daee] Creating VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1003.121271] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-a9a71a09-84c0-4064-9252-7ddb09ea1f7e {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1003.138740] env[62277]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1003.138740] env[62277]: value = "task-1405307" [ 1003.138740] env[62277]: _type = "Task" [ 1003.138740] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1003.146473] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405307, 'name': CreateVM_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1003.534155] env[62277]: DEBUG nova.compute.manager [req-6576c314-f800-4e67-adb6-e398b2dc5c8c req-fc5fc1f4-e8ee-4b4f-a6af-84e26d7f7a7c service nova] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] Received event network-vif-plugged-1ed02afb-746f-473b-828a-aae94ab6258f {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 1003.534155] env[62277]: DEBUG oslo_concurrency.lockutils [req-6576c314-f800-4e67-adb6-e398b2dc5c8c req-fc5fc1f4-e8ee-4b4f-a6af-84e26d7f7a7c service nova] Acquiring lock "dfc291fd-1481-4e76-9fb3-ec87124c1281-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1003.536414] env[62277]: DEBUG oslo_concurrency.lockutils [req-6576c314-f800-4e67-adb6-e398b2dc5c8c req-fc5fc1f4-e8ee-4b4f-a6af-84e26d7f7a7c service nova] Lock "dfc291fd-1481-4e76-9fb3-ec87124c1281-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.002s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1003.539094] env[62277]: DEBUG oslo_concurrency.lockutils [req-6576c314-f800-4e67-adb6-e398b2dc5c8c req-fc5fc1f4-e8ee-4b4f-a6af-84e26d7f7a7c service nova] Lock "dfc291fd-1481-4e76-9fb3-ec87124c1281-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1003.539094] env[62277]: DEBUG nova.compute.manager [req-6576c314-f800-4e67-adb6-e398b2dc5c8c req-fc5fc1f4-e8ee-4b4f-a6af-84e26d7f7a7c service nova] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] No waiting events found dispatching network-vif-plugged-1ed02afb-746f-473b-828a-aae94ab6258f {{(pid=62277) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1003.539094] env[62277]: WARNING nova.compute.manager [req-6576c314-f800-4e67-adb6-e398b2dc5c8c req-fc5fc1f4-e8ee-4b4f-a6af-84e26d7f7a7c service nova] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] Received unexpected event network-vif-plugged-1ed02afb-746f-473b-828a-aae94ab6258f for instance with vm_state building and task_state spawning. [ 1003.539094] env[62277]: DEBUG nova.compute.manager [req-6576c314-f800-4e67-adb6-e398b2dc5c8c req-fc5fc1f4-e8ee-4b4f-a6af-84e26d7f7a7c service nova] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] Received event network-changed-1ed02afb-746f-473b-828a-aae94ab6258f {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 1003.539429] env[62277]: DEBUG nova.compute.manager [req-6576c314-f800-4e67-adb6-e398b2dc5c8c req-fc5fc1f4-e8ee-4b4f-a6af-84e26d7f7a7c service nova] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] Refreshing instance network info cache due to event network-changed-1ed02afb-746f-473b-828a-aae94ab6258f. 
{{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11104}} [ 1003.539429] env[62277]: DEBUG oslo_concurrency.lockutils [req-6576c314-f800-4e67-adb6-e398b2dc5c8c req-fc5fc1f4-e8ee-4b4f-a6af-84e26d7f7a7c service nova] Acquiring lock "refresh_cache-dfc291fd-1481-4e76-9fb3-ec87124c1281" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1003.539429] env[62277]: DEBUG oslo_concurrency.lockutils [req-6576c314-f800-4e67-adb6-e398b2dc5c8c req-fc5fc1f4-e8ee-4b4f-a6af-84e26d7f7a7c service nova] Acquired lock "refresh_cache-dfc291fd-1481-4e76-9fb3-ec87124c1281" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1003.539429] env[62277]: DEBUG nova.network.neutron [req-6576c314-f800-4e67-adb6-e398b2dc5c8c req-fc5fc1f4-e8ee-4b4f-a6af-84e26d7f7a7c service nova] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] Refreshing network info cache for port 1ed02afb-746f-473b-828a-aae94ab6258f {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1003.648301] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405307, 'name': CreateVM_Task, 'duration_secs': 0.284414} completed successfully. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1003.648641] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 68925f1b-da69-4955-acb1-d6500b03daee] Created VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1003.649135] env[62277]: DEBUG oslo_concurrency.lockutils [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1003.649343] env[62277]: DEBUG oslo_concurrency.lockutils [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1003.649907] env[62277]: DEBUG oslo_concurrency.lockutils [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1003.650224] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2421e8e5-4cfa-468f-afdb-c354fe1dd233 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1003.655724] env[62277]: DEBUG oslo_vmware.api [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Waiting for the task: (returnval){ [ 1003.655724] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52d53a1b-de0c-dede-be1f-2db3bd277a8a" [ 1003.655724] env[62277]: _type = "Task" [ 1003.655724] env[62277]: } to complete. 
{{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1003.664183] env[62277]: DEBUG oslo_vmware.api [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52d53a1b-de0c-dede-be1f-2db3bd277a8a, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1004.168914] env[62277]: DEBUG oslo_concurrency.lockutils [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1004.169688] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] [instance: 68925f1b-da69-4955-acb1-d6500b03daee] Processing image 6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1004.170099] env[62277]: DEBUG oslo_concurrency.lockutils [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1004.239997] env[62277]: DEBUG nova.network.neutron [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] Successfully updated port: 7d48e1c6-05b6-4a29-b3f3-b901df38433f {{(pid=62277) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1004.263289] env[62277]: DEBUG oslo_concurrency.lockutils [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Acquiring lock "refresh_cache-36ff1435-1999-4e95-8920-81a1b25cc452" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1004.263481] env[62277]: DEBUG oslo_concurrency.lockutils [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Acquired lock "refresh_cache-36ff1435-1999-4e95-8920-81a1b25cc452" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1004.263723] env[62277]: DEBUG nova.network.neutron [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] Building network info cache for instance {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1004.414232] env[62277]: DEBUG nova.network.neutron [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 
36ff1435-1999-4e95-8920-81a1b25cc452] Instance cache missing network info. {{(pid=62277) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1004.655986] env[62277]: DEBUG nova.network.neutron [req-6576c314-f800-4e67-adb6-e398b2dc5c8c req-fc5fc1f4-e8ee-4b4f-a6af-84e26d7f7a7c service nova] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] Updated VIF entry in instance network info cache for port 1ed02afb-746f-473b-828a-aae94ab6258f. {{(pid=62277) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1004.656411] env[62277]: DEBUG nova.network.neutron [req-6576c314-f800-4e67-adb6-e398b2dc5c8c req-fc5fc1f4-e8ee-4b4f-a6af-84e26d7f7a7c service nova] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] Updating instance_info_cache with network_info: [{"id": "1ed02afb-746f-473b-828a-aae94ab6258f", "address": "fa:16:3e:ba:1f:b3", "network": {"id": "712edd31-d6ed-48d2-989d-d16b2d30d012", "bridge": "br-int", "label": "tempest-ServersTestJSON-912307607-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "37c92b5660e84218890a498bbe1519b5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8b29df12-5674-476d-a9e5-5e20f704d224", "external-id": "nsx-vlan-transportzone-754", "segmentation_id": 754, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1ed02afb-74", "ovs_interfaceid": "1ed02afb-746f-473b-828a-aae94ab6258f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1004.667202] env[62277]: DEBUG oslo_concurrency.lockutils [req-6576c314-f800-4e67-adb6-e398b2dc5c8c req-fc5fc1f4-e8ee-4b4f-a6af-84e26d7f7a7c service nova] Releasing lock "refresh_cache-dfc291fd-1481-4e76-9fb3-ec87124c1281" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1004.775491] env[62277]: DEBUG nova.compute.manager [req-d4ea07d8-6dc1-42c5-8ee3-6e1fd7c02ca8 req-dc23584f-84cf-4488-945d-5e8438a9c583 service nova] [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] Received event network-changed-c460565a-65c2-4cca-8ffa-2d386aa65882 {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 1004.775700] env[62277]: DEBUG nova.compute.manager [req-d4ea07d8-6dc1-42c5-8ee3-6e1fd7c02ca8 req-dc23584f-84cf-4488-945d-5e8438a9c583 service nova] [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] Refreshing instance network info cache due to event network-changed-c460565a-65c2-4cca-8ffa-2d386aa65882. 
{{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11104}} [ 1004.775997] env[62277]: DEBUG oslo_concurrency.lockutils [req-d4ea07d8-6dc1-42c5-8ee3-6e1fd7c02ca8 req-dc23584f-84cf-4488-945d-5e8438a9c583 service nova] Acquiring lock "refresh_cache-19d15611-315f-4c4b-8f32-e5d00d0d8ca8" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1004.776154] env[62277]: DEBUG oslo_concurrency.lockutils [req-d4ea07d8-6dc1-42c5-8ee3-6e1fd7c02ca8 req-dc23584f-84cf-4488-945d-5e8438a9c583 service nova] Acquired lock "refresh_cache-19d15611-315f-4c4b-8f32-e5d00d0d8ca8" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1004.777358] env[62277]: DEBUG nova.network.neutron [req-d4ea07d8-6dc1-42c5-8ee3-6e1fd7c02ca8 req-dc23584f-84cf-4488-945d-5e8438a9c583 service nova] [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] Refreshing network info cache for port c460565a-65c2-4cca-8ffa-2d386aa65882 {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1004.966705] env[62277]: DEBUG nova.network.neutron [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] Updating instance_info_cache with network_info: [{"id": "7d48e1c6-05b6-4a29-b3f3-b901df38433f", "address": "fa:16:3e:78:e9:43", "network": {"id": "5a77f2c9-7fb5-47ff-a2b8-e3eef10df9c4", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-14441068-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f56098c52907445bb4675268403fe9f6", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "aef08290-001a-4ae8-aff0-1889e2211389", "external-id": "nsx-vlan-transportzone-389", "segmentation_id": 389, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7d48e1c6-05", "ovs_interfaceid": "7d48e1c6-05b6-4a29-b3f3-b901df38433f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1004.992050] env[62277]: DEBUG oslo_concurrency.lockutils [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Releasing lock "refresh_cache-36ff1435-1999-4e95-8920-81a1b25cc452" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1004.992642] env[62277]: DEBUG nova.compute.manager [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] Instance network_info: |[{"id": "7d48e1c6-05b6-4a29-b3f3-b901df38433f", "address": "fa:16:3e:78:e9:43", "network": {"id": "5a77f2c9-7fb5-47ff-a2b8-e3eef10df9c4", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-14441068-network", "subnets": [{"cidr": 
"192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f56098c52907445bb4675268403fe9f6", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "aef08290-001a-4ae8-aff0-1889e2211389", "external-id": "nsx-vlan-transportzone-389", "segmentation_id": 389, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7d48e1c6-05", "ovs_interfaceid": "7d48e1c6-05b6-4a29-b3f3-b901df38433f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1004.993183] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:78:e9:43', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'aef08290-001a-4ae8-aff0-1889e2211389', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '7d48e1c6-05b6-4a29-b3f3-b901df38433f', 'vif_model': 'vmxnet3'}] {{(pid=62277) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1005.002996] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Creating folder: Project (f56098c52907445bb4675268403fe9f6). Parent ref: group-v297781. {{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1005.002996] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-71a03c97-8ef5-4a85-8715-2365ed181a59 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1005.013881] env[62277]: INFO nova.virt.vmwareapi.vm_util [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Created folder: Project (f56098c52907445bb4675268403fe9f6) in parent group-v297781. [ 1005.014108] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Creating folder: Instances. Parent ref: group-v297806. {{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1005.014363] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-bd2f9301-ae0d-4196-90c8-ebdfe3dc629f {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1005.024250] env[62277]: INFO nova.virt.vmwareapi.vm_util [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Created folder: Instances in parent group-v297806. 
[ 1005.024500] env[62277]: DEBUG oslo.service.loopingcall [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1005.026321] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] Creating VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1005.029953] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-3a619ecf-5cde-421c-a918-8e9b048581e4 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1005.048554] env[62277]: DEBUG oslo_concurrency.lockutils [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Acquiring lock "3862bece-d65f-4a89-b9fa-262ea01d10b9" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1005.048787] env[62277]: DEBUG oslo_concurrency.lockutils [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Lock "3862bece-d65f-4a89-b9fa-262ea01d10b9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1005.058966] env[62277]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1005.058966] env[62277]: value = "task-1405311" [ 1005.058966] env[62277]: _type = "Task" [ 1005.058966] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1005.066582] env[62277]: DEBUG nova.compute.manager [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1005.076244] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405311, 'name': CreateVM_Task} progress is 6%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1005.144025] env[62277]: DEBUG oslo_concurrency.lockutils [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1005.144276] env[62277]: DEBUG oslo_concurrency.lockutils [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1005.147164] env[62277]: INFO nova.compute.claims [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1005.475024] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62f8932d-3ef0-4fad-bb60-ec8d7ace76be {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1005.489849] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a2b7f0b3-74cf-4949-ab4e-96bbe491bc57 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1005.529369] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d43d186b-5144-4dec-bdb9-56f55b7c70e3 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1005.539065] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-029c4998-589a-4a5b-ba62-7aedd22378f5 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1005.555984] env[62277]: DEBUG nova.compute.provider_tree [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1005.567599] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405311, 'name': CreateVM_Task, 'duration_secs': 0.343552} completed successfully. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1005.568430] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] Created VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1005.569239] env[62277]: DEBUG nova.scheduler.client.report [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1005.572604] env[62277]: DEBUG oslo_concurrency.lockutils [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1005.572790] env[62277]: DEBUG oslo_concurrency.lockutils [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1005.573110] env[62277]: DEBUG oslo_concurrency.lockutils [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1005.574100] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-120d5076-7a16-463d-a45b-f5573daafc6c {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1005.583545] env[62277]: DEBUG oslo_vmware.api [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Waiting for the task: (returnval){ [ 1005.583545] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52fbcaf2-0537-de13-4de6-8ada2e48965e" [ 1005.583545] env[62277]: _type = "Task" [ 1005.583545] env[62277]: } to complete. 
{{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1005.587726] env[62277]: DEBUG oslo_concurrency.lockutils [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.442s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1005.587726] env[62277]: DEBUG nova.compute.manager [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] Start building networks asynchronously for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1005.596033] env[62277]: DEBUG oslo_vmware.api [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52fbcaf2-0537-de13-4de6-8ada2e48965e, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1005.635390] env[62277]: DEBUG nova.compute.utils [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Using /dev/sd instead of None {{(pid=62277) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1005.636620] env[62277]: DEBUG nova.compute.manager [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] Allocating IP information in the background. {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1005.636789] env[62277]: DEBUG nova.network.neutron [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] allocate_for_instance() {{(pid=62277) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1005.657188] env[62277]: DEBUG nova.compute.manager [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] Start building block device mappings for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1005.734341] env[62277]: DEBUG nova.compute.manager [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] Start spawning the instance on the hypervisor. 
{{(pid=62277) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1005.776907] env[62277]: DEBUG nova.virt.hardware [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-11-06T22:11:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-11-06T22:11:15Z,direct_url=,disk_format='vmdk',id=6f125163-af69-40e9-92ae-3b8a01d74b60,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='63ed6630e9c140baa826f53d7a0564d1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-11-06T22:11:16Z,virtual_size=,visibility=), allow threads: False {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1005.777193] env[62277]: DEBUG nova.virt.hardware [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Flavor limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1005.777352] env[62277]: DEBUG nova.virt.hardware [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Image limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1005.777522] env[62277]: DEBUG nova.virt.hardware [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Flavor pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1005.777663] env[62277]: DEBUG nova.virt.hardware [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Image pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1005.777805] env[62277]: DEBUG nova.virt.hardware [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1005.778029] env[62277]: DEBUG nova.virt.hardware [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1005.778362] env[62277]: DEBUG nova.virt.hardware [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62277) 
_get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1005.778580] env[62277]: DEBUG nova.virt.hardware [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Got 1 possible topologies {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1005.778747] env[62277]: DEBUG nova.virt.hardware [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1005.778940] env[62277]: DEBUG nova.virt.hardware [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1005.779849] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-138478ec-bbea-4e51-b015-95219b1b8cce {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1005.790512] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5b841dee-725b-40ff-881c-9845e5a4e032 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1005.815427] env[62277]: DEBUG oslo_concurrency.lockutils [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Acquiring lock "43255fee-e5da-4fe0-8fa7-4aba7592745b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1005.815648] env[62277]: DEBUG oslo_concurrency.lockutils [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Lock "43255fee-e5da-4fe0-8fa7-4aba7592745b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1005.827678] env[62277]: DEBUG nova.compute.manager [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] Starting instance... 
{{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1005.899899] env[62277]: DEBUG oslo_concurrency.lockutils [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1005.900145] env[62277]: DEBUG oslo_concurrency.lockutils [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1005.903111] env[62277]: INFO nova.compute.claims [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1005.951764] env[62277]: DEBUG nova.policy [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2819fa58fa344ce9a1c899accbcc9143', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7584ecf374364f13ac45d342018ee2ca', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62277) authorize /opt/stack/nova/nova/policy.py:203}} [ 1005.991925] env[62277]: DEBUG nova.network.neutron [req-d4ea07d8-6dc1-42c5-8ee3-6e1fd7c02ca8 req-dc23584f-84cf-4488-945d-5e8438a9c583 service nova] [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] Updated VIF entry in instance network info cache for port c460565a-65c2-4cca-8ffa-2d386aa65882. 
{{(pid=62277) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1005.992280] env[62277]: DEBUG nova.network.neutron [req-d4ea07d8-6dc1-42c5-8ee3-6e1fd7c02ca8 req-dc23584f-84cf-4488-945d-5e8438a9c583 service nova] [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] Updating instance_info_cache with network_info: [{"id": "c460565a-65c2-4cca-8ffa-2d386aa65882", "address": "fa:16:3e:75:52:26", "network": {"id": "eda6b8aa-c2b9-4c83-bc9a-09322648796f", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1860511403-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "29dbbf00030a407986371193ea850423", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "30c39e9a-a798-4f25-a48c-91f786ba332c", "external-id": "nsx-vlan-transportzone-438", "segmentation_id": 438, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc460565a-65", "ovs_interfaceid": "c460565a-65c2-4cca-8ffa-2d386aa65882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1006.006547] env[62277]: DEBUG oslo_concurrency.lockutils [req-d4ea07d8-6dc1-42c5-8ee3-6e1fd7c02ca8 req-dc23584f-84cf-4488-945d-5e8438a9c583 service nova] Releasing lock "refresh_cache-19d15611-315f-4c4b-8f32-e5d00d0d8ca8" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1006.100393] env[62277]: DEBUG oslo_concurrency.lockutils [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1006.103967] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] Processing image 6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1006.103967] env[62277]: DEBUG oslo_concurrency.lockutils [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1006.273531] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0e12cd88-1674-46bf-87d5-6c3c8c2a1b0d {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1006.283992] env[62277]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e7d9f798-5919-4287-9d77-fccde9ace1e0 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1006.323827] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d4d0e98-7d4a-45b7-860b-69c68ca0d044 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1006.336437] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc26a894-98af-408f-ab8b-3d0b1b99513f {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1006.355600] env[62277]: DEBUG nova.compute.provider_tree [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1006.372176] env[62277]: DEBUG nova.scheduler.client.report [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1006.395484] env[62277]: DEBUG oslo_concurrency.lockutils [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.495s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1006.396111] env[62277]: DEBUG nova.compute.manager [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] Start building networks asynchronously for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1006.479146] env[62277]: DEBUG nova.compute.utils [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Using /dev/sd instead of None {{(pid=62277) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1006.480473] env[62277]: DEBUG nova.compute.manager [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] Allocating IP information in the background. 
{{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1006.480617] env[62277]: DEBUG nova.network.neutron [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] allocate_for_instance() {{(pid=62277) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1006.499610] env[62277]: DEBUG nova.compute.manager [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] Start building block device mappings for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1006.620212] env[62277]: DEBUG nova.compute.manager [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] Start spawning the instance on the hypervisor. {{(pid=62277) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1006.667870] env[62277]: DEBUG nova.virt.hardware [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-11-06T22:11:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-11-06T22:11:15Z,direct_url=,disk_format='vmdk',id=6f125163-af69-40e9-92ae-3b8a01d74b60,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='63ed6630e9c140baa826f53d7a0564d1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-11-06T22:11:16Z,virtual_size=,visibility=), allow threads: False {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1006.668214] env[62277]: DEBUG nova.virt.hardware [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Flavor limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1006.668273] env[62277]: DEBUG nova.virt.hardware [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Image limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1006.668455] env[62277]: DEBUG nova.virt.hardware [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Flavor pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1006.668599] env[62277]: DEBUG nova.virt.hardware [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Image pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1006.668741] env[62277]: DEBUG nova.virt.hardware [None 
req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1006.668946] env[62277]: DEBUG nova.virt.hardware [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1006.669114] env[62277]: DEBUG nova.virt.hardware [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1006.669280] env[62277]: DEBUG nova.virt.hardware [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Got 1 possible topologies {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1006.669619] env[62277]: DEBUG nova.virt.hardware [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1006.669619] env[62277]: DEBUG nova.virt.hardware [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1006.670595] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a733326e-23b9-45db-a771-ef9e6f252075 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1006.680520] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-11d073b8-86ae-4e6a-8eed-293a42552f28 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1006.860765] env[62277]: DEBUG nova.policy [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0ec8b5ed58474c4b874c1231ac8c92e1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f56098c52907445bb4675268403fe9f6', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62277) authorize /opt/stack/nova/nova/policy.py:203}} [ 1007.047099] env[62277]: DEBUG nova.network.neutron [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 
tempest-AttachInterfacesUnderV243Test-476296311-project-member] [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] Successfully created port: 7b56a287-90ae-418a-9eb4-58076a3abcd2 {{(pid=62277) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1007.535661] env[62277]: DEBUG nova.network.neutron [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] Successfully created port: 1b6f17ee-0249-4cf6-87cf-3ccbc25e322b {{(pid=62277) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1007.997795] env[62277]: DEBUG nova.compute.manager [req-91ca7ae9-79b3-4f50-9e40-a355ada0272b req-05331bd0-a681-465f-889a-b02d621d5c08 service nova] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] Received event network-vif-plugged-7d48e1c6-05b6-4a29-b3f3-b901df38433f {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 1007.998064] env[62277]: DEBUG oslo_concurrency.lockutils [req-91ca7ae9-79b3-4f50-9e40-a355ada0272b req-05331bd0-a681-465f-889a-b02d621d5c08 service nova] Acquiring lock "36ff1435-1999-4e95-8920-81a1b25cc452-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1007.998339] env[62277]: DEBUG oslo_concurrency.lockutils [req-91ca7ae9-79b3-4f50-9e40-a355ada0272b req-05331bd0-a681-465f-889a-b02d621d5c08 service nova] Lock "36ff1435-1999-4e95-8920-81a1b25cc452-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1007.998418] env[62277]: DEBUG oslo_concurrency.lockutils [req-91ca7ae9-79b3-4f50-9e40-a355ada0272b req-05331bd0-a681-465f-889a-b02d621d5c08 service nova] Lock "36ff1435-1999-4e95-8920-81a1b25cc452-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1007.998573] env[62277]: DEBUG nova.compute.manager [req-91ca7ae9-79b3-4f50-9e40-a355ada0272b req-05331bd0-a681-465f-889a-b02d621d5c08 service nova] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] No waiting events found dispatching network-vif-plugged-7d48e1c6-05b6-4a29-b3f3-b901df38433f {{(pid=62277) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1007.998701] env[62277]: WARNING nova.compute.manager [req-91ca7ae9-79b3-4f50-9e40-a355ada0272b req-05331bd0-a681-465f-889a-b02d621d5c08 service nova] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] Received unexpected event network-vif-plugged-7d48e1c6-05b6-4a29-b3f3-b901df38433f for instance with vm_state building and task_state spawning. 
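The repeated "Acquiring lock … acquired … :: waited … released … :: held …" triplets in these entries are emitted by oslo.concurrency's lock wrappers (the inner function in lockutils.py cited on each line). Below is a minimal sketch of the public locking primitive behind that pattern, assuming only oslo_concurrency.lockutils; the function name and the waiting_events table are illustrative, not Nova's actual code:

    from oslo_concurrency import lockutils

    def pop_instance_event(waiting_events, instance_uuid, event_name):
        # Serialize access to an instance's pending-event table with a
        # per-instance lock named like the "<uuid>-events" locks in the log.
        with lockutils.lock(f"{instance_uuid}-events"):
            # A missing entry corresponds to the "No waiting events found
            # dispatching ..." / "Received unexpected event ..." lines above.
            return waiting_events.pop(event_name, None)
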
[ 1007.999067] env[62277]: DEBUG nova.compute.manager [req-91ca7ae9-79b3-4f50-9e40-a355ada0272b req-05331bd0-a681-465f-889a-b02d621d5c08 service nova] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] Received event network-changed-7d48e1c6-05b6-4a29-b3f3-b901df38433f {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 1008.003441] env[62277]: DEBUG nova.compute.manager [req-91ca7ae9-79b3-4f50-9e40-a355ada0272b req-05331bd0-a681-465f-889a-b02d621d5c08 service nova] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] Refreshing instance network info cache due to event network-changed-7d48e1c6-05b6-4a29-b3f3-b901df38433f. {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11104}} [ 1008.003441] env[62277]: DEBUG oslo_concurrency.lockutils [req-91ca7ae9-79b3-4f50-9e40-a355ada0272b req-05331bd0-a681-465f-889a-b02d621d5c08 service nova] Acquiring lock "refresh_cache-36ff1435-1999-4e95-8920-81a1b25cc452" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1008.003441] env[62277]: DEBUG oslo_concurrency.lockutils [req-91ca7ae9-79b3-4f50-9e40-a355ada0272b req-05331bd0-a681-465f-889a-b02d621d5c08 service nova] Acquired lock "refresh_cache-36ff1435-1999-4e95-8920-81a1b25cc452" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1008.003441] env[62277]: DEBUG nova.network.neutron [req-91ca7ae9-79b3-4f50-9e40-a355ada0272b req-05331bd0-a681-465f-889a-b02d621d5c08 service nova] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] Refreshing network info cache for port 7d48e1c6-05b6-4a29-b3f3-b901df38433f {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1008.744251] env[62277]: DEBUG nova.network.neutron [req-91ca7ae9-79b3-4f50-9e40-a355ada0272b req-05331bd0-a681-465f-889a-b02d621d5c08 service nova] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] Updated VIF entry in instance network info cache for port 7d48e1c6-05b6-4a29-b3f3-b901df38433f. 
{{(pid=62277) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1008.744251] env[62277]: DEBUG nova.network.neutron [req-91ca7ae9-79b3-4f50-9e40-a355ada0272b req-05331bd0-a681-465f-889a-b02d621d5c08 service nova] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] Updating instance_info_cache with network_info: [{"id": "7d48e1c6-05b6-4a29-b3f3-b901df38433f", "address": "fa:16:3e:78:e9:43", "network": {"id": "5a77f2c9-7fb5-47ff-a2b8-e3eef10df9c4", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-14441068-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f56098c52907445bb4675268403fe9f6", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "aef08290-001a-4ae8-aff0-1889e2211389", "external-id": "nsx-vlan-transportzone-389", "segmentation_id": 389, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7d48e1c6-05", "ovs_interfaceid": "7d48e1c6-05b6-4a29-b3f3-b901df38433f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1008.754568] env[62277]: DEBUG oslo_concurrency.lockutils [req-91ca7ae9-79b3-4f50-9e40-a355ada0272b req-05331bd0-a681-465f-889a-b02d621d5c08 service nova] Releasing lock "refresh_cache-36ff1435-1999-4e95-8920-81a1b25cc452" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1008.886112] env[62277]: DEBUG nova.network.neutron [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] Successfully updated port: 7b56a287-90ae-418a-9eb4-58076a3abcd2 {{(pid=62277) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1008.901451] env[62277]: DEBUG oslo_concurrency.lockutils [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Acquiring lock "refresh_cache-3862bece-d65f-4a89-b9fa-262ea01d10b9" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1008.903951] env[62277]: DEBUG oslo_concurrency.lockutils [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Acquired lock "refresh_cache-3862bece-d65f-4a89-b9fa-262ea01d10b9" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1008.903951] env[62277]: DEBUG nova.network.neutron [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] Building network info cache for instance {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1008.999469] env[62277]: DEBUG 
nova.network.neutron [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] Instance cache missing network info. {{(pid=62277) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1009.477011] env[62277]: DEBUG nova.network.neutron [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] Updating instance_info_cache with network_info: [{"id": "7b56a287-90ae-418a-9eb4-58076a3abcd2", "address": "fa:16:3e:be:03:6c", "network": {"id": "b2c1f1eb-ee37-4d67-9ec4-abfe7cc1573f", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1172561999-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7584ecf374364f13ac45d342018ee2ca", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "b80dd748-3d7e-4a23-a38d-9e79a3881452", "external-id": "nsx-vlan-transportzone-497", "segmentation_id": 497, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7b56a287-90", "ovs_interfaceid": "7b56a287-90ae-418a-9eb4-58076a3abcd2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1009.496941] env[62277]: DEBUG oslo_concurrency.lockutils [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Releasing lock "refresh_cache-3862bece-d65f-4a89-b9fa-262ea01d10b9" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1009.497268] env[62277]: DEBUG nova.compute.manager [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] Instance network_info: |[{"id": "7b56a287-90ae-418a-9eb4-58076a3abcd2", "address": "fa:16:3e:be:03:6c", "network": {"id": "b2c1f1eb-ee37-4d67-9ec4-abfe7cc1573f", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1172561999-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7584ecf374364f13ac45d342018ee2ca", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "b80dd748-3d7e-4a23-a38d-9e79a3881452", "external-id": "nsx-vlan-transportzone-497", "segmentation_id": 497, "bound_drivers": {"0": 
"nsxv3"}}, "devname": "tap7b56a287-90", "ovs_interfaceid": "7b56a287-90ae-418a-9eb4-58076a3abcd2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1009.497661] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:be:03:6c', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'b80dd748-3d7e-4a23-a38d-9e79a3881452', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '7b56a287-90ae-418a-9eb4-58076a3abcd2', 'vif_model': 'vmxnet3'}] {{(pid=62277) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1009.507984] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Creating folder: Project (7584ecf374364f13ac45d342018ee2ca). Parent ref: group-v297781. {{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1009.508317] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-6c97aeee-bb13-4bd0-9ddb-fa61c68d1be8 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1009.515974] env[62277]: DEBUG nova.network.neutron [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] Successfully updated port: 1b6f17ee-0249-4cf6-87cf-3ccbc25e322b {{(pid=62277) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1009.522685] env[62277]: INFO nova.virt.vmwareapi.vm_util [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Created folder: Project (7584ecf374364f13ac45d342018ee2ca) in parent group-v297781. [ 1009.522685] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Creating folder: Instances. Parent ref: group-v297809. {{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1009.522685] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7ab04607-8642-4479-b8ad-c4731c64fb0d {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1009.533350] env[62277]: INFO nova.virt.vmwareapi.vm_util [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Created folder: Instances in parent group-v297809. [ 1009.533468] env[62277]: DEBUG oslo.service.loopingcall [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1009.536302] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] Creating VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1009.536302] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-9ce12d07-0402-4d25-9d43-28a5fd381eb3 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1009.551165] env[62277]: DEBUG oslo_concurrency.lockutils [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Acquiring lock "refresh_cache-43255fee-e5da-4fe0-8fa7-4aba7592745b" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1009.551401] env[62277]: DEBUG oslo_concurrency.lockutils [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Acquired lock "refresh_cache-43255fee-e5da-4fe0-8fa7-4aba7592745b" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1009.551477] env[62277]: DEBUG nova.network.neutron [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] Building network info cache for instance {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1009.563507] env[62277]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1009.563507] env[62277]: value = "task-1405316" [ 1009.563507] env[62277]: _type = "Task" [ 1009.563507] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1009.572583] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405316, 'name': CreateVM_Task} progress is 5%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1009.619959] env[62277]: DEBUG nova.network.neutron [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] Instance cache missing network info. 
{{(pid=62277) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1009.725885] env[62277]: DEBUG oslo_concurrency.lockutils [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Acquiring lock "866c4415-caab-4d81-86ba-ed662feb3c4f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1009.725885] env[62277]: DEBUG oslo_concurrency.lockutils [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Lock "866c4415-caab-4d81-86ba-ed662feb3c4f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1010.072086] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405316, 'name': CreateVM_Task, 'duration_secs': 0.379495} completed successfully. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1010.072439] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] Created VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1010.073411] env[62277]: DEBUG oslo_vmware.service [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dcc3ce4a-38a9-4d8b-8cc5-b48a70cd722a {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1010.079343] env[62277]: DEBUG oslo_concurrency.lockutils [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1010.079707] env[62277]: DEBUG oslo_concurrency.lockutils [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1010.079895] env[62277]: DEBUG oslo_concurrency.lockutils [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1010.080155] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-64823636-02d8-476b-acaf-35e9c7b020ec {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1010.088483] env[62277]: DEBUG oslo_vmware.api 
[None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Waiting for the task: (returnval){ [ 1010.088483] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]520a8077-6d6d-9172-16e4-2a24e6e8ac44" [ 1010.088483] env[62277]: _type = "Task" [ 1010.088483] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1010.098214] env[62277]: DEBUG oslo_vmware.api [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]520a8077-6d6d-9172-16e4-2a24e6e8ac44, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1010.219815] env[62277]: DEBUG nova.network.neutron [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] Updating instance_info_cache with network_info: [{"id": "1b6f17ee-0249-4cf6-87cf-3ccbc25e322b", "address": "fa:16:3e:b5:4c:ec", "network": {"id": "5a77f2c9-7fb5-47ff-a2b8-e3eef10df9c4", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-14441068-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f56098c52907445bb4675268403fe9f6", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "aef08290-001a-4ae8-aff0-1889e2211389", "external-id": "nsx-vlan-transportzone-389", "segmentation_id": 389, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1b6f17ee-02", "ovs_interfaceid": "1b6f17ee-0249-4cf6-87cf-3ccbc25e322b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1010.244975] env[62277]: DEBUG oslo_concurrency.lockutils [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Releasing lock "refresh_cache-43255fee-e5da-4fe0-8fa7-4aba7592745b" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1010.246519] env[62277]: DEBUG nova.compute.manager [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] Instance network_info: |[{"id": "1b6f17ee-0249-4cf6-87cf-3ccbc25e322b", "address": "fa:16:3e:b5:4c:ec", "network": {"id": "5a77f2c9-7fb5-47ff-a2b8-e3eef10df9c4", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-14441068-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f56098c52907445bb4675268403fe9f6", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "aef08290-001a-4ae8-aff0-1889e2211389", "external-id": "nsx-vlan-transportzone-389", "segmentation_id": 389, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1b6f17ee-02", "ovs_interfaceid": "1b6f17ee-0249-4cf6-87cf-3ccbc25e322b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1010.246866] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:b5:4c:ec', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'aef08290-001a-4ae8-aff0-1889e2211389', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '1b6f17ee-0249-4cf6-87cf-3ccbc25e322b', 'vif_model': 'vmxnet3'}] {{(pid=62277) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1010.259761] env[62277]: DEBUG oslo.service.loopingcall [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1010.260943] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] Creating VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1010.261191] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-3c34d437-c37e-4f02-8c98-e6d087fa233d {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1010.289901] env[62277]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1010.289901] env[62277]: value = "task-1405317" [ 1010.289901] env[62277]: _type = "Task" [ 1010.289901] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1010.300157] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405317, 'name': CreateVM_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1010.602049] env[62277]: DEBUG oslo_concurrency.lockutils [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1010.602475] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] Processing image 6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1010.602829] env[62277]: DEBUG oslo_concurrency.lockutils [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1010.603112] env[62277]: DEBUG oslo_concurrency.lockutils [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1010.603414] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1010.603778] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d82586c2-9d83-482b-b9df-ed5aaedc8af3 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1010.611776] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1010.613802] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=62277) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1010.613802] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e5c0d3d-13a5-4b05-b6ff-229cdd911e1c {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1010.621674] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-8567a8e6-3f1e-489e-805b-e8f1f8422a68 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1010.627372] env[62277]: DEBUG oslo_vmware.api [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Waiting for the task: (returnval){ [ 1010.627372] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52450c10-0fbb-9609-dcac-6d9fe4119cb6" [ 1010.627372] env[62277]: _type = "Task" [ 1010.627372] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1010.640162] env[62277]: DEBUG oslo_vmware.api [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52450c10-0fbb-9609-dcac-6d9fe4119cb6, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1010.800594] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405317, 'name': CreateVM_Task, 'duration_secs': 0.324148} completed successfully. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1010.800594] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] Created VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1010.801042] env[62277]: DEBUG oslo_concurrency.lockutils [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1010.801042] env[62277]: DEBUG oslo_concurrency.lockutils [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1010.801574] env[62277]: DEBUG oslo_concurrency.lockutils [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1010.801574] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b677dbd5-f6ba-4168-a92b-98a2f0e30c26 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1010.806614] env[62277]: DEBUG oslo_vmware.api [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Waiting for the task: (returnval){ [ 1010.806614] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52f2b773-d149-4d4c-53c9-fcaf89c11589" [ 1010.806614] env[62277]: _type = "Task" [ 1010.806614] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1010.820754] env[62277]: DEBUG oslo_vmware.api [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52f2b773-d149-4d4c-53c9-fcaf89c11589, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1011.139258] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] Preparing fetch location {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1011.139524] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Creating directory with path [datastore1] vmware_temp/2c2ab30f-13c6-447b-88b9-42754657c9fa/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1011.139682] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f6027084-4ac9-4d2c-991f-52aabb6d6200 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1011.157948] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Created directory with path [datastore1] vmware_temp/2c2ab30f-13c6-447b-88b9-42754657c9fa/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1011.158177] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] Fetch image to [datastore1] vmware_temp/2c2ab30f-13c6-447b-88b9-42754657c9fa/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1011.158345] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to [datastore1] vmware_temp/2c2ab30f-13c6-447b-88b9-42754657c9fa/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore1 {{(pid=62277) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1011.159188] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-86ee6ea4-7c72-415b-873d-c8f9677d25ad {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1011.172120] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a2aab0c3-8e91-4fac-b745-5f95eeeb243b {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1011.185875] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1545d6ff-0c74-48a0-af41-09492dbf05b1 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1011.223802] env[62277]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1dd62611-2ef5-4ab9-902e-164045ca741f {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1011.230759] env[62277]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-cc852e26-8df8-49ed-9b00-37738d9603ac {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1011.319369] env[62277]: DEBUG oslo_concurrency.lockutils [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1011.319639] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] Processing image 6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1011.319845] env[62277]: DEBUG oslo_concurrency.lockutils [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1011.326159] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to the data store datastore1 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1011.408926] env[62277]: DEBUG oslo_vmware.rw_handles [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2c2ab30f-13c6-447b-88b9-42754657c9fa/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=62277) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1011.475204] env[62277]: DEBUG oslo_vmware.rw_handles [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Completed reading data from the image iterator. 
{{(pid=62277) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1011.475408] env[62277]: DEBUG oslo_vmware.rw_handles [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2c2ab30f-13c6-447b-88b9-42754657c9fa/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=62277) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1011.791593] env[62277]: DEBUG nova.compute.manager [req-8095470f-a9dc-4dc0-a34b-f351ff002832 req-315d3546-c318-4e68-8b8d-fb796d827704 service nova] [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] Received event network-vif-plugged-1b6f17ee-0249-4cf6-87cf-3ccbc25e322b {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 1011.792268] env[62277]: DEBUG oslo_concurrency.lockutils [req-8095470f-a9dc-4dc0-a34b-f351ff002832 req-315d3546-c318-4e68-8b8d-fb796d827704 service nova] Acquiring lock "43255fee-e5da-4fe0-8fa7-4aba7592745b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1011.792528] env[62277]: DEBUG oslo_concurrency.lockutils [req-8095470f-a9dc-4dc0-a34b-f351ff002832 req-315d3546-c318-4e68-8b8d-fb796d827704 service nova] Lock "43255fee-e5da-4fe0-8fa7-4aba7592745b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1011.793412] env[62277]: DEBUG oslo_concurrency.lockutils [req-8095470f-a9dc-4dc0-a34b-f351ff002832 req-315d3546-c318-4e68-8b8d-fb796d827704 service nova] Lock "43255fee-e5da-4fe0-8fa7-4aba7592745b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1011.793603] env[62277]: DEBUG nova.compute.manager [req-8095470f-a9dc-4dc0-a34b-f351ff002832 req-315d3546-c318-4e68-8b8d-fb796d827704 service nova] [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] No waiting events found dispatching network-vif-plugged-1b6f17ee-0249-4cf6-87cf-3ccbc25e322b {{(pid=62277) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1011.793774] env[62277]: WARNING nova.compute.manager [req-8095470f-a9dc-4dc0-a34b-f351ff002832 req-315d3546-c318-4e68-8b8d-fb796d827704 service nova] [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] Received unexpected event network-vif-plugged-1b6f17ee-0249-4cf6-87cf-3ccbc25e322b for instance with vm_state building and task_state spawning. 
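The CreateVM_Task and SearchDatastore_Task entries above follow the wait_for_task pattern visible in the log: invoke the API call, then poll the returned task until it reports completion ("progress is N%", then "completed successfully" with a duration_secs). The loop below is a generic illustration of that polling idea, not oslo.vmware's actual implementation; fetch_task_info and the interval default are assumptions made for the sketch:

    import time

    def wait_for_task(fetch_task_info, task_ref, interval=0.5):
        # Poll a vCenter-style task object until it finishes, mirroring the
        # "Task: {...} progress is N%" entries emitted while waiting.
        while True:
            # fetch_task_info is assumed to return a mapping such as
            # {"state": "running"|"success"|"error", "progress": int, "error": str|None}
            info = fetch_task_info(task_ref)
            if info["state"] == "success":
                return info
            if info["state"] == "error":
                raise RuntimeError(f"Task {task_ref} failed: {info.get('error')}")
            time.sleep(interval)
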
[ 1011.983330] env[62277]: DEBUG oslo_concurrency.lockutils [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Acquiring lock "350e2302-66b9-4dd6-b0f4-77000992408b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1011.983330] env[62277]: DEBUG oslo_concurrency.lockutils [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Lock "350e2302-66b9-4dd6-b0f4-77000992408b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1012.031474] env[62277]: DEBUG nova.compute.manager [req-1e80c5f1-1756-47cb-a6e4-19a5ce8ec8cf req-24f9675e-fbb7-4720-9270-a92ef1112ef2 service nova] [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] Received event network-changed-1b6f17ee-0249-4cf6-87cf-3ccbc25e322b {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 1012.031637] env[62277]: DEBUG nova.compute.manager [req-1e80c5f1-1756-47cb-a6e4-19a5ce8ec8cf req-24f9675e-fbb7-4720-9270-a92ef1112ef2 service nova] [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] Refreshing instance network info cache due to event network-changed-1b6f17ee-0249-4cf6-87cf-3ccbc25e322b. {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11104}} [ 1012.031874] env[62277]: DEBUG oslo_concurrency.lockutils [req-1e80c5f1-1756-47cb-a6e4-19a5ce8ec8cf req-24f9675e-fbb7-4720-9270-a92ef1112ef2 service nova] Acquiring lock "refresh_cache-43255fee-e5da-4fe0-8fa7-4aba7592745b" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1012.032054] env[62277]: DEBUG oslo_concurrency.lockutils [req-1e80c5f1-1756-47cb-a6e4-19a5ce8ec8cf req-24f9675e-fbb7-4720-9270-a92ef1112ef2 service nova] Acquired lock "refresh_cache-43255fee-e5da-4fe0-8fa7-4aba7592745b" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1012.032218] env[62277]: DEBUG nova.network.neutron [req-1e80c5f1-1756-47cb-a6e4-19a5ce8ec8cf req-24f9675e-fbb7-4720-9270-a92ef1112ef2 service nova] [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] Refreshing network info cache for port 1b6f17ee-0249-4cf6-87cf-3ccbc25e322b {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1012.371707] env[62277]: DEBUG nova.compute.manager [req-c44fc237-42f8-452d-b017-a7d529e3ee53 req-5d7fc49b-168a-4274-9ed1-ca33f94b921a service nova] [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] Received event network-vif-plugged-7b56a287-90ae-418a-9eb4-58076a3abcd2 {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 1012.372200] env[62277]: DEBUG oslo_concurrency.lockutils [req-c44fc237-42f8-452d-b017-a7d529e3ee53 req-5d7fc49b-168a-4274-9ed1-ca33f94b921a service nova] Acquiring lock "3862bece-d65f-4a89-b9fa-262ea01d10b9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1012.372501] env[62277]: DEBUG 
oslo_concurrency.lockutils [req-c44fc237-42f8-452d-b017-a7d529e3ee53 req-5d7fc49b-168a-4274-9ed1-ca33f94b921a service nova] Lock "3862bece-d65f-4a89-b9fa-262ea01d10b9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1012.372644] env[62277]: DEBUG oslo_concurrency.lockutils [req-c44fc237-42f8-452d-b017-a7d529e3ee53 req-5d7fc49b-168a-4274-9ed1-ca33f94b921a service nova] Lock "3862bece-d65f-4a89-b9fa-262ea01d10b9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1012.372899] env[62277]: DEBUG nova.compute.manager [req-c44fc237-42f8-452d-b017-a7d529e3ee53 req-5d7fc49b-168a-4274-9ed1-ca33f94b921a service nova] [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] No waiting events found dispatching network-vif-plugged-7b56a287-90ae-418a-9eb4-58076a3abcd2 {{(pid=62277) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1012.373155] env[62277]: WARNING nova.compute.manager [req-c44fc237-42f8-452d-b017-a7d529e3ee53 req-5d7fc49b-168a-4274-9ed1-ca33f94b921a service nova] [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] Received unexpected event network-vif-plugged-7b56a287-90ae-418a-9eb4-58076a3abcd2 for instance with vm_state building and task_state spawning. [ 1012.373359] env[62277]: DEBUG nova.compute.manager [req-c44fc237-42f8-452d-b017-a7d529e3ee53 req-5d7fc49b-168a-4274-9ed1-ca33f94b921a service nova] [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] Received event network-changed-7b56a287-90ae-418a-9eb4-58076a3abcd2 {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 1012.373562] env[62277]: DEBUG nova.compute.manager [req-c44fc237-42f8-452d-b017-a7d529e3ee53 req-5d7fc49b-168a-4274-9ed1-ca33f94b921a service nova] [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] Refreshing instance network info cache due to event network-changed-7b56a287-90ae-418a-9eb4-58076a3abcd2. 
{{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11104}} [ 1012.373794] env[62277]: DEBUG oslo_concurrency.lockutils [req-c44fc237-42f8-452d-b017-a7d529e3ee53 req-5d7fc49b-168a-4274-9ed1-ca33f94b921a service nova] Acquiring lock "refresh_cache-3862bece-d65f-4a89-b9fa-262ea01d10b9" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1012.374084] env[62277]: DEBUG oslo_concurrency.lockutils [req-c44fc237-42f8-452d-b017-a7d529e3ee53 req-5d7fc49b-168a-4274-9ed1-ca33f94b921a service nova] Acquired lock "refresh_cache-3862bece-d65f-4a89-b9fa-262ea01d10b9" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1012.374169] env[62277]: DEBUG nova.network.neutron [req-c44fc237-42f8-452d-b017-a7d529e3ee53 req-5d7fc49b-168a-4274-9ed1-ca33f94b921a service nova] [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] Refreshing network info cache for port 7b56a287-90ae-418a-9eb4-58076a3abcd2 {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1012.674366] env[62277]: DEBUG nova.network.neutron [req-1e80c5f1-1756-47cb-a6e4-19a5ce8ec8cf req-24f9675e-fbb7-4720-9270-a92ef1112ef2 service nova] [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] Updated VIF entry in instance network info cache for port 1b6f17ee-0249-4cf6-87cf-3ccbc25e322b. {{(pid=62277) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1012.674366] env[62277]: DEBUG nova.network.neutron [req-1e80c5f1-1756-47cb-a6e4-19a5ce8ec8cf req-24f9675e-fbb7-4720-9270-a92ef1112ef2 service nova] [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] Updating instance_info_cache with network_info: [{"id": "1b6f17ee-0249-4cf6-87cf-3ccbc25e322b", "address": "fa:16:3e:b5:4c:ec", "network": {"id": "5a77f2c9-7fb5-47ff-a2b8-e3eef10df9c4", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-14441068-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f56098c52907445bb4675268403fe9f6", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "aef08290-001a-4ae8-aff0-1889e2211389", "external-id": "nsx-vlan-transportzone-389", "segmentation_id": 389, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1b6f17ee-02", "ovs_interfaceid": "1b6f17ee-0249-4cf6-87cf-3ccbc25e322b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1012.688552] env[62277]: DEBUG oslo_concurrency.lockutils [req-1e80c5f1-1756-47cb-a6e4-19a5ce8ec8cf req-24f9675e-fbb7-4720-9270-a92ef1112ef2 service nova] Releasing lock "refresh_cache-43255fee-e5da-4fe0-8fa7-4aba7592745b" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1012.892951] env[62277]: DEBUG nova.network.neutron [req-c44fc237-42f8-452d-b017-a7d529e3ee53 req-5d7fc49b-168a-4274-9ed1-ca33f94b921a service nova] [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] Updated VIF 
entry in instance network info cache for port 7b56a287-90ae-418a-9eb4-58076a3abcd2. {{(pid=62277) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1012.893337] env[62277]: DEBUG nova.network.neutron [req-c44fc237-42f8-452d-b017-a7d529e3ee53 req-5d7fc49b-168a-4274-9ed1-ca33f94b921a service nova] [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] Updating instance_info_cache with network_info: [{"id": "7b56a287-90ae-418a-9eb4-58076a3abcd2", "address": "fa:16:3e:be:03:6c", "network": {"id": "b2c1f1eb-ee37-4d67-9ec4-abfe7cc1573f", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1172561999-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7584ecf374364f13ac45d342018ee2ca", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "b80dd748-3d7e-4a23-a38d-9e79a3881452", "external-id": "nsx-vlan-transportzone-497", "segmentation_id": 497, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7b56a287-90", "ovs_interfaceid": "7b56a287-90ae-418a-9eb4-58076a3abcd2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1012.907351] env[62277]: DEBUG oslo_concurrency.lockutils [req-c44fc237-42f8-452d-b017-a7d529e3ee53 req-5d7fc49b-168a-4274-9ed1-ca33f94b921a service nova] Releasing lock "refresh_cache-3862bece-d65f-4a89-b9fa-262ea01d10b9" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1014.006376] env[62277]: DEBUG oslo_concurrency.lockutils [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Acquiring lock "23bc5a48-3e96-4897-bf28-ad14a0bdde62" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1014.006376] env[62277]: DEBUG oslo_concurrency.lockutils [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Lock "23bc5a48-3e96-4897-bf28-ad14a0bdde62" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1015.489148] env[62277]: DEBUG oslo_concurrency.lockutils [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Acquiring lock "346748bd-b4e8-4e93-b71d-66c90a45e372" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1015.489148] env[62277]: DEBUG oslo_concurrency.lockutils [None 
req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Lock "346748bd-b4e8-4e93-b71d-66c90a45e372" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1021.428790] env[62277]: DEBUG oslo_concurrency.lockutils [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Acquiring lock "930ff058-ab48-4c8a-8f5e-4820a3b12d50" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1021.429126] env[62277]: DEBUG oslo_concurrency.lockutils [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Lock "930ff058-ab48-4c8a-8f5e-4820a3b12d50" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1023.025293] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Acquiring lock "154fa64c-55d4-4b72-8af9-39e72fd5df5f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1023.026678] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Lock "154fa64c-55d4-4b72-8af9-39e72fd5df5f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1023.622578] env[62277]: DEBUG oslo_concurrency.lockutils [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Acquiring lock "3b9b51fe-d558-40ef-b22a-2b8958b3e1a0" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1023.623593] env[62277]: DEBUG oslo_concurrency.lockutils [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Lock "3b9b51fe-d558-40ef-b22a-2b8958b3e1a0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1025.436485] env[62277]: DEBUG oslo_concurrency.lockutils [None req-ffd72be3-f106-43c5-b3e1-558211d8f7ca tempest-MultipleCreateTestJSON-1044264016 tempest-MultipleCreateTestJSON-1044264016-project-member] Acquiring lock "79177b7a-1bf6-4649-80c7-4ba2c6cda0ad" 
by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1025.436799] env[62277]: DEBUG oslo_concurrency.lockutils [None req-ffd72be3-f106-43c5-b3e1-558211d8f7ca tempest-MultipleCreateTestJSON-1044264016 tempest-MultipleCreateTestJSON-1044264016-project-member] Lock "79177b7a-1bf6-4649-80c7-4ba2c6cda0ad" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1025.470069] env[62277]: DEBUG oslo_concurrency.lockutils [None req-ffd72be3-f106-43c5-b3e1-558211d8f7ca tempest-MultipleCreateTestJSON-1044264016 tempest-MultipleCreateTestJSON-1044264016-project-member] Acquiring lock "4cc3443d-3d05-4d3b-a222-6b7367c1c989" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1025.470349] env[62277]: DEBUG oslo_concurrency.lockutils [None req-ffd72be3-f106-43c5-b3e1-558211d8f7ca tempest-MultipleCreateTestJSON-1044264016 tempest-MultipleCreateTestJSON-1044264016-project-member] Lock "4cc3443d-3d05-4d3b-a222-6b7367c1c989" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1026.141387] env[62277]: DEBUG oslo_concurrency.lockutils [None req-34538215-56b8-42d9-8162-924ac491e844 tempest-ServerMetadataTestJSON-1737770471 tempest-ServerMetadataTestJSON-1737770471-project-member] Acquiring lock "04122dba-4cbf-4176-95bd-f1c4bfaa799e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1026.141654] env[62277]: DEBUG oslo_concurrency.lockutils [None req-34538215-56b8-42d9-8162-924ac491e844 tempest-ServerMetadataTestJSON-1737770471 tempest-ServerMetadataTestJSON-1737770471-project-member] Lock "04122dba-4cbf-4176-95bd-f1c4bfaa799e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1029.096897] env[62277]: WARNING oslo_vmware.rw_handles [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1029.096897] env[62277]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1029.096897] env[62277]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1029.096897] env[62277]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1029.096897] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1029.096897] env[62277]: ERROR oslo_vmware.rw_handles response.begin() [ 1029.096897] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin 
[ 1029.096897] env[62277]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1029.096897] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1029.096897] env[62277]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1029.096897] env[62277]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1029.096897] env[62277]: ERROR oslo_vmware.rw_handles [ 1029.097638] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] Downloaded image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to vmware_temp/80e86d35-69df-4ee4-b932-2bfa3caa4df1/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1029.098647] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] Caching image {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1029.098890] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Copying Virtual Disk [datastore2] vmware_temp/80e86d35-69df-4ee4-b932-2bfa3caa4df1/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk to [datastore2] vmware_temp/80e86d35-69df-4ee4-b932-2bfa3caa4df1/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk {{(pid=62277) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1029.099200] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-436d9653-d948-4acc-8111-9053f072e716 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1029.114296] env[62277]: DEBUG oslo_vmware.api [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Waiting for the task: (returnval){ [ 1029.114296] env[62277]: value = "task-1405322" [ 1029.114296] env[62277]: _type = "Task" [ 1029.114296] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1029.127281] env[62277]: DEBUG oslo_vmware.api [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Task: {'id': task-1405322, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1029.626250] env[62277]: DEBUG oslo_vmware.exceptions [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Fault InvalidArgument not matched. 
{{(pid=62277) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1029.626744] env[62277]: DEBUG oslo_concurrency.lockutils [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1029.630901] env[62277]: ERROR nova.compute.manager [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1029.630901] env[62277]: Faults: ['InvalidArgument'] [ 1029.630901] env[62277]: ERROR nova.compute.manager [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] Traceback (most recent call last): [ 1029.630901] env[62277]: ERROR nova.compute.manager [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1029.630901] env[62277]: ERROR nova.compute.manager [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] yield resources [ 1029.630901] env[62277]: ERROR nova.compute.manager [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1029.630901] env[62277]: ERROR nova.compute.manager [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] self.driver.spawn(context, instance, image_meta, [ 1029.630901] env[62277]: ERROR nova.compute.manager [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1029.630901] env[62277]: ERROR nova.compute.manager [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1029.630901] env[62277]: ERROR nova.compute.manager [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1029.630901] env[62277]: ERROR nova.compute.manager [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] self._fetch_image_if_missing(context, vi) [ 1029.630901] env[62277]: ERROR nova.compute.manager [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1029.631477] env[62277]: ERROR nova.compute.manager [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] image_cache(vi, tmp_image_ds_loc) [ 1029.631477] env[62277]: ERROR nova.compute.manager [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1029.631477] env[62277]: ERROR nova.compute.manager [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] vm_util.copy_virtual_disk( [ 1029.631477] env[62277]: ERROR nova.compute.manager [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1029.631477] env[62277]: ERROR nova.compute.manager [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] session._wait_for_task(vmdk_copy_task) [ 1029.631477] env[62277]: ERROR nova.compute.manager [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 
157, in _wait_for_task [ 1029.631477] env[62277]: ERROR nova.compute.manager [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] return self.wait_for_task(task_ref) [ 1029.631477] env[62277]: ERROR nova.compute.manager [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1029.631477] env[62277]: ERROR nova.compute.manager [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] return evt.wait() [ 1029.631477] env[62277]: ERROR nova.compute.manager [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1029.631477] env[62277]: ERROR nova.compute.manager [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] result = hub.switch() [ 1029.631477] env[62277]: ERROR nova.compute.manager [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1029.631477] env[62277]: ERROR nova.compute.manager [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] return self.greenlet.switch() [ 1029.631974] env[62277]: ERROR nova.compute.manager [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1029.631974] env[62277]: ERROR nova.compute.manager [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] self.f(*self.args, **self.kw) [ 1029.631974] env[62277]: ERROR nova.compute.manager [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1029.631974] env[62277]: ERROR nova.compute.manager [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] raise exceptions.translate_fault(task_info.error) [ 1029.631974] env[62277]: ERROR nova.compute.manager [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1029.631974] env[62277]: ERROR nova.compute.manager [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] Faults: ['InvalidArgument'] [ 1029.631974] env[62277]: ERROR nova.compute.manager [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] [ 1029.631974] env[62277]: INFO nova.compute.manager [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] Terminating instance [ 1029.633780] env[62277]: DEBUG oslo_concurrency.lockutils [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Acquiring lock "refresh_cache-6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1029.633937] env[62277]: DEBUG oslo_concurrency.lockutils [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Acquired lock "refresh_cache-6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1029.634260] env[62277]: DEBUG nova.network.neutron [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] Building network info cache 
for instance {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1029.635407] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1029.635611] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1029.636084] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c0f9a7da-7bcb-4b8f-8e85-dccc1040e35c {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1029.647023] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1029.647305] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62277) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1029.654071] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-cc5a8da0-d42a-499a-a906-0dfb89a32f78 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1029.663616] env[62277]: DEBUG oslo_vmware.api [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Waiting for the task: (returnval){ [ 1029.663616] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52ccb00d-93ba-1a03-8de3-7b486f575cb0" [ 1029.663616] env[62277]: _type = "Task" [ 1029.663616] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1029.672047] env[62277]: DEBUG oslo_vmware.api [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52ccb00d-93ba-1a03-8de3-7b486f575cb0, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1029.684224] env[62277]: DEBUG nova.network.neutron [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] Instance cache missing network info. 
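Annotation: the "Waiting for the task: (returnval){ value = ... _type = Task }" and "progress is 0%" entries show the oslo.vmware task loop: a vCenter task reference is polled until it reaches a terminal state, and a task error (such as the "A specified parameter was not correct: fileType" fault in the traceback above) is translated into a VimFaultException. A stand-alone sketch of that polling shape; the stand-in fault class and the poll_task callable are illustrations, not oslo.vmware's real API:

    import time

    class FaultException(Exception):
        """Stand-in for oslo_vmware.exceptions.VimFaultException (illustration only)."""

    def wait_for_task(poll_task, interval=0.5):
        """Poll poll_task() until it reports success or error, like the log's task loop."""
        while True:
            state, payload = poll_task()       # e.g. ('running', '0%'), ('success', result)
            if state == 'success':
                return payload
            if state == 'error':
                # oslo.vmware raises the translated fault here; see the traceback above.
                raise FaultException(payload)
            time.sleep(interval)

    # Example: a copy task that fails the way task-1405322 does in the log.
    states = iter([('running', '0%'),
                   ('error', 'A specified parameter was not correct: fileType')])
    try:
        wait_for_task(lambda: next(states), interval=0.01)
    except FaultException as exc:
        print('task failed:', exc)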
{{(pid=62277) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1029.826993] env[62277]: DEBUG nova.network.neutron [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1029.836498] env[62277]: DEBUG oslo_concurrency.lockutils [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Releasing lock "refresh_cache-6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1029.836678] env[62277]: DEBUG nova.compute.manager [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] Start destroying the instance on the hypervisor. {{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1029.836864] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1029.838022] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-95220e58-1359-4b79-92af-5fa6a62778aa {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1029.846757] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] Unregistering the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1029.846998] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-b56d9cb0-f1aa-4e06-99e2-2e49246be522 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1029.882085] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] Unregistered the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1029.882370] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] Deleting contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1029.882491] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Deleting the datastore file [datastore2] 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1029.882740] 
env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-53745378-20bc-4abb-a4b4-b1e216604616 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1029.897979] env[62277]: DEBUG oslo_vmware.api [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Waiting for the task: (returnval){ [ 1029.897979] env[62277]: value = "task-1405324" [ 1029.897979] env[62277]: _type = "Task" [ 1029.897979] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1029.914181] env[62277]: DEBUG oslo_vmware.api [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Task: {'id': task-1405324, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1030.180450] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] Preparing fetch location {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1030.180450] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Creating directory with path [datastore2] vmware_temp/7b9d2a9f-81cb-4e3d-a4af-c0c56bc1b793/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1030.180738] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-72a2ac07-fbf7-4c06-90de-5fc4cb556baf {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1030.192073] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Created directory with path [datastore2] vmware_temp/7b9d2a9f-81cb-4e3d-a4af-c0c56bc1b793/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1030.192299] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] Fetch image to [datastore2] vmware_temp/7b9d2a9f-81cb-4e3d-a4af-c0c56bc1b793/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1030.193155] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to [datastore2] vmware_temp/7b9d2a9f-81cb-4e3d-a4af-c0c56bc1b793/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1030.195966] env[62277]: 
DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6de7234f-3dbd-4d0b-adb5-9c348c645a11 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1030.202582] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e02bcb3-936c-4f87-9b7f-dadfd3a78295 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1030.214471] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf73129f-eaf2-47dc-945e-c4e7a3b1cd21 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1030.268806] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aa54ed1b-96ef-410d-a1b2-5714c4d23930 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1030.272972] env[62277]: DEBUG oslo_concurrency.lockutils [None req-f6182b83-f1e1-4d2b-91e7-5f82871cf9d7 tempest-FloatingIPsAssociationNegativeTestJSON-1284752111 tempest-FloatingIPsAssociationNegativeTestJSON-1284752111-project-member] Acquiring lock "30d7d279-7241-4f2e-b963-d205a5f9fa41" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1030.273696] env[62277]: DEBUG oslo_concurrency.lockutils [None req-f6182b83-f1e1-4d2b-91e7-5f82871cf9d7 tempest-FloatingIPsAssociationNegativeTestJSON-1284752111 tempest-FloatingIPsAssociationNegativeTestJSON-1284752111-project-member] Lock "30d7d279-7241-4f2e-b963-d205a5f9fa41" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1030.277584] env[62277]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-72b09e68-9a6a-47a6-be12-89f9e836a48b {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1030.298620] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1030.374881] env[62277]: DEBUG oslo_vmware.rw_handles [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/7b9d2a9f-81cb-4e3d-a4af-c0c56bc1b793/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=62277) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1030.441525] env[62277]: DEBUG oslo_vmware.rw_handles [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Completed reading data from the image iterator. {{(pid=62277) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1030.441727] env[62277]: DEBUG oslo_vmware.rw_handles [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/7b9d2a9f-81cb-4e3d-a4af-c0c56bc1b793/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62277) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1030.445552] env[62277]: DEBUG oslo_vmware.api [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Task: {'id': task-1405324, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.045723} completed successfully. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1030.445784] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Deleted the datastore file {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1030.445960] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] Deleted contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1030.446135] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1030.446592] env[62277]: INFO nova.compute.manager [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] Took 0.61 seconds to destroy the instance on the hypervisor. [ 1030.446835] env[62277]: DEBUG oslo.service.loopingcall [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1030.447073] env[62277]: DEBUG nova.compute.manager [-] [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] Skipping network deallocation for instance since networking was not requested. 
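Annotation: the rw_handles entries around here describe the image upload as a plain HTTPS transfer: a write handle is opened against the datastore's /folder/... URL (with dcPath and dsName query parameters), the image bytes are streamed in, and the handle is closed. A hedged sketch of that transfer using requests; the PUT method, cookie-based session auth, and header choices are assumptions about the general shape, not oslo.vmware's exact implementation (the traceback above shows it drives http.client directly):

    import requests  # assumption: requests is available for the sketch

    def upload_to_datastore(local_path, folder_url, session_cookie, verify_ssl=True):
        """Stream a local file to a vSphere datastore /folder/ URL.

        folder_url looks like the one in the log:
        https://<esx-host>/folder/<path>.vmdk?dcPath=ha-datacenter&dsName=datastore2
        session_cookie is assumed to come from an authenticated vCenter/ESX session.
        """
        with open(local_path, 'rb') as src:
            resp = requests.put(
                folder_url,
                data=src,  # file object is streamed rather than read into memory
                headers={'Cookie': session_cookie,
                         'Content-Type': 'application/octet-stream'},
                verify=verify_ssl,
            )
        resp.raise_for_status()
        return resp.status_code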
{{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1030.450184] env[62277]: DEBUG nova.compute.claims [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] Aborting claim: {{(pid=62277) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1030.450361] env[62277]: DEBUG oslo_concurrency.lockutils [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1030.450567] env[62277]: DEBUG oslo_concurrency.lockutils [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1030.922679] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c1dfa0a-d7cd-4bd5-946e-b21429479aab {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1030.932930] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e056e5d-dd36-49f0-b42b-2c9d4731e5ac {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1030.965811] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6bbad6e0-7a24-49c5-a997-eca7355e9f96 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1030.973911] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9805b0bc-204b-40d6-ba08-9a5f26bf90b6 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1030.991920] env[62277]: DEBUG nova.compute.provider_tree [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1031.003212] env[62277]: DEBUG nova.scheduler.client.report [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1031.027030] env[62277]: DEBUG oslo_concurrency.lockutils [None 
req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.576s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1031.027551] env[62277]: ERROR nova.compute.manager [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1031.027551] env[62277]: Faults: ['InvalidArgument'] [ 1031.027551] env[62277]: ERROR nova.compute.manager [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] Traceback (most recent call last): [ 1031.027551] env[62277]: ERROR nova.compute.manager [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1031.027551] env[62277]: ERROR nova.compute.manager [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] self.driver.spawn(context, instance, image_meta, [ 1031.027551] env[62277]: ERROR nova.compute.manager [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1031.027551] env[62277]: ERROR nova.compute.manager [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1031.027551] env[62277]: ERROR nova.compute.manager [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1031.027551] env[62277]: ERROR nova.compute.manager [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] self._fetch_image_if_missing(context, vi) [ 1031.027551] env[62277]: ERROR nova.compute.manager [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1031.027551] env[62277]: ERROR nova.compute.manager [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] image_cache(vi, tmp_image_ds_loc) [ 1031.027551] env[62277]: ERROR nova.compute.manager [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1031.028209] env[62277]: ERROR nova.compute.manager [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] vm_util.copy_virtual_disk( [ 1031.028209] env[62277]: ERROR nova.compute.manager [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1031.028209] env[62277]: ERROR nova.compute.manager [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] session._wait_for_task(vmdk_copy_task) [ 1031.028209] env[62277]: ERROR nova.compute.manager [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1031.028209] env[62277]: ERROR nova.compute.manager [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] return self.wait_for_task(task_ref) [ 1031.028209] env[62277]: ERROR nova.compute.manager [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1031.028209] env[62277]: ERROR nova.compute.manager [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] return 
evt.wait() [ 1031.028209] env[62277]: ERROR nova.compute.manager [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1031.028209] env[62277]: ERROR nova.compute.manager [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] result = hub.switch() [ 1031.028209] env[62277]: ERROR nova.compute.manager [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1031.028209] env[62277]: ERROR nova.compute.manager [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] return self.greenlet.switch() [ 1031.028209] env[62277]: ERROR nova.compute.manager [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1031.028209] env[62277]: ERROR nova.compute.manager [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] self.f(*self.args, **self.kw) [ 1031.028586] env[62277]: ERROR nova.compute.manager [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1031.028586] env[62277]: ERROR nova.compute.manager [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] raise exceptions.translate_fault(task_info.error) [ 1031.028586] env[62277]: ERROR nova.compute.manager [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1031.028586] env[62277]: ERROR nova.compute.manager [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] Faults: ['InvalidArgument'] [ 1031.028586] env[62277]: ERROR nova.compute.manager [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] [ 1031.028586] env[62277]: DEBUG nova.compute.utils [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] VimFaultException {{(pid=62277) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1031.039165] env[62277]: DEBUG nova.compute.manager [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] Build of instance 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb was re-scheduled: A specified parameter was not correct: fileType [ 1031.039165] env[62277]: Faults: ['InvalidArgument'] {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1031.039165] env[62277]: DEBUG nova.compute.manager [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] Unplugging VIFs for instance {{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1031.039165] env[62277]: DEBUG oslo_concurrency.lockutils [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Acquiring lock "refresh_cache-6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1031.039165] env[62277]: DEBUG oslo_concurrency.lockutils [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 
tempest-ServersAdmin275Test-915880979-project-member] Acquired lock "refresh_cache-6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1031.039390] env[62277]: DEBUG nova.network.neutron [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] Building network info cache for instance {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1031.118025] env[62277]: DEBUG nova.network.neutron [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] Instance cache missing network info. {{(pid=62277) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1031.298118] env[62277]: DEBUG nova.network.neutron [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1031.310670] env[62277]: DEBUG oslo_concurrency.lockutils [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Releasing lock "refresh_cache-6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1031.310898] env[62277]: DEBUG nova.compute.manager [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1031.311147] env[62277]: DEBUG nova.compute.manager [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] [instance: 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb] Skipping network deallocation for instance since networking was not requested. {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1031.461309] env[62277]: INFO nova.scheduler.client.report [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Deleted allocations for instance 6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb [ 1031.493806] env[62277]: DEBUG oslo_concurrency.lockutils [None req-a5092740-02fe-46a6-8046-1c22d3c8b1a8 tempest-ServersAdmin275Test-915880979 tempest-ServersAdmin275Test-915880979-project-member] Lock "6ab5c776-0265-47cf-aa26-1d9b6ea3e1fb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 50.842s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1031.526979] env[62277]: DEBUG nova.compute.manager [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Starting instance... 
{{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1031.599915] env[62277]: DEBUG oslo_concurrency.lockutils [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1031.599915] env[62277]: DEBUG oslo_concurrency.lockutils [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1031.600730] env[62277]: INFO nova.compute.claims [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1032.008407] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b532a337-82e9-46f0-a695-d63d7bd6af08 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1032.017978] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0abd8e3b-fe6d-4bcf-9443-b9a785a33989 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1032.065678] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3f82589-f1e1-4b4e-9d0b-6e4988ce8eb5 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1032.073640] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-818930f5-5a91-4506-9ff1-035a395c3f5b {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1032.090594] env[62277]: DEBUG nova.compute.provider_tree [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1032.099212] env[62277]: DEBUG nova.scheduler.client.report [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 
1032.116197] env[62277]: DEBUG oslo_concurrency.lockutils [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.517s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1032.116704] env[62277]: DEBUG nova.compute.manager [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Start building networks asynchronously for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1032.168963] env[62277]: DEBUG nova.compute.utils [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Using /dev/sd instead of None {{(pid=62277) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1032.170500] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1032.171065] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Starting heal instance info cache {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9909}} [ 1032.171065] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Rebuilding the list of instances to heal {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9913}} [ 1032.174553] env[62277]: DEBUG nova.compute.manager [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Allocating IP information in the background. 
{{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1032.174681] env[62277]: DEBUG nova.network.neutron [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] allocate_for_instance() {{(pid=62277) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1032.182184] env[62277]: DEBUG oslo_concurrency.lockutils [None req-cda784f4-a56f-494b-9766-89b311c4a626 tempest-ServerActionsTestOtherA-1333146290 tempest-ServerActionsTestOtherA-1333146290-project-member] Acquiring lock "fb905507-549f-4265-9851-4a42930c02a4" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1032.182446] env[62277]: DEBUG oslo_concurrency.lockutils [None req-cda784f4-a56f-494b-9766-89b311c4a626 tempest-ServerActionsTestOtherA-1333146290 tempest-ServerActionsTestOtherA-1333146290-project-member] Lock "fb905507-549f-4265-9851-4a42930c02a4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1032.184213] env[62277]: DEBUG nova.compute.manager [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Start building block device mappings for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1032.213404] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1032.213404] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1032.213546] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1032.213580] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1032.215525] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] Skipping network cache update for instance because it is Building. 
{{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1032.215525] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1032.215525] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 68925f1b-da69-4955-acb1-d6500b03daee] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1032.215525] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1032.215525] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1032.215774] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1032.215774] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Didn't find any instances for network info cache update. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9995}} [ 1032.281243] env[62277]: DEBUG nova.compute.manager [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Start spawning the instance on the hypervisor. 
{{(pid=62277) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1032.320982] env[62277]: DEBUG nova.virt.hardware [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-11-06T22:11:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-11-06T22:11:15Z,direct_url=,disk_format='vmdk',id=6f125163-af69-40e9-92ae-3b8a01d74b60,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='63ed6630e9c140baa826f53d7a0564d1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-11-06T22:11:16Z,virtual_size=,visibility=), allow threads: False {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1032.321851] env[62277]: DEBUG nova.virt.hardware [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Flavor limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1032.321851] env[62277]: DEBUG nova.virt.hardware [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Image limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1032.321998] env[62277]: DEBUG nova.virt.hardware [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Flavor pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1032.322132] env[62277]: DEBUG nova.virt.hardware [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Image pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1032.323937] env[62277]: DEBUG nova.virt.hardware [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1032.323937] env[62277]: DEBUG nova.virt.hardware [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1032.323937] env[62277]: DEBUG nova.virt.hardware [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62277) 
_get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1032.323937] env[62277]: DEBUG nova.virt.hardware [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Got 1 possible topologies {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1032.324247] env[62277]: DEBUG nova.virt.hardware [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1032.324247] env[62277]: DEBUG nova.virt.hardware [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1032.324247] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-54c3ae01-cd6e-496b-8069-747466f2be2b {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1032.333393] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-939ac731-e1ad-4a1a-b83a-82dede6d079e {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1032.577942] env[62277]: DEBUG nova.policy [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '54fbb5090d0841f8ae6ed934a842191e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e7da6090138b4821be06acab5460942e', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62277) authorize /opt/stack/nova/nova/policy.py:203}} [ 1033.171552] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1033.209182] env[62277]: DEBUG oslo_concurrency.lockutils [None req-17030804-65dc-46c0-96b3-152f53ed921a tempest-FloatingIPsAssociationTestJSON-809897368 tempest-FloatingIPsAssociationTestJSON-809897368-project-member] Acquiring lock "4f5018a5-26f5-4a31-8a9d-d2557c906995" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1033.209182] env[62277]: DEBUG oslo_concurrency.lockutils [None req-17030804-65dc-46c0-96b3-152f53ed921a tempest-FloatingIPsAssociationTestJSON-809897368 tempest-FloatingIPsAssociationTestJSON-809897368-project-member] Lock "4f5018a5-26f5-4a31-8a9d-d2557c906995" acquired by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1033.590431] env[62277]: DEBUG nova.network.neutron [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Successfully created port: ed495c04-4a8b-4554-a74c-7087a6458572 {{(pid=62277) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1033.819677] env[62277]: DEBUG oslo_concurrency.lockutils [None req-80a2a202-71d0-421b-93e2-fb77247d2728 tempest-SecurityGroupsTestJSON-463872970 tempest-SecurityGroupsTestJSON-463872970-project-member] Acquiring lock "c01ad807-a9a5-4028-baf4-0469a6301459" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1033.820517] env[62277]: DEBUG oslo_concurrency.lockutils [None req-80a2a202-71d0-421b-93e2-fb77247d2728 tempest-SecurityGroupsTestJSON-463872970 tempest-SecurityGroupsTestJSON-463872970-project-member] Lock "c01ad807-a9a5-4028-baf4-0469a6301459" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1034.779939] env[62277]: DEBUG nova.network.neutron [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Successfully updated port: ed495c04-4a8b-4554-a74c-7087a6458572 {{(pid=62277) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1034.795606] env[62277]: DEBUG oslo_concurrency.lockutils [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Acquiring lock "refresh_cache-866c4415-caab-4d81-86ba-ed662feb3c4f" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1034.797227] env[62277]: DEBUG oslo_concurrency.lockutils [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Acquired lock "refresh_cache-866c4415-caab-4d81-86ba-ed662feb3c4f" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1034.797227] env[62277]: DEBUG nova.network.neutron [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Building network info cache for instance {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1034.849296] env[62277]: DEBUG nova.network.neutron [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Instance cache missing network info. 
{{(pid=62277) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1035.102818] env[62277]: DEBUG nova.network.neutron [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Updating instance_info_cache with network_info: [{"id": "ed495c04-4a8b-4554-a74c-7087a6458572", "address": "fa:16:3e:95:e4:ce", "network": {"id": "2c64e00f-4995-4827-b685-5096b1d1d064", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.74", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "63ed6630e9c140baa826f53d7a0564d1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "77aa121f-8fb6-42f3-aaea-43addfe449b2", "external-id": "nsx-vlan-transportzone-288", "segmentation_id": 288, "bound_drivers": {"0": "nsxv3"}}, "devname": "taped495c04-4a", "ovs_interfaceid": "ed495c04-4a8b-4554-a74c-7087a6458572", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1035.119573] env[62277]: DEBUG oslo_concurrency.lockutils [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Releasing lock "refresh_cache-866c4415-caab-4d81-86ba-ed662feb3c4f" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1035.120214] env[62277]: DEBUG nova.compute.manager [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Instance network_info: |[{"id": "ed495c04-4a8b-4554-a74c-7087a6458572", "address": "fa:16:3e:95:e4:ce", "network": {"id": "2c64e00f-4995-4827-b685-5096b1d1d064", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.74", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "63ed6630e9c140baa826f53d7a0564d1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "77aa121f-8fb6-42f3-aaea-43addfe449b2", "external-id": "nsx-vlan-transportzone-288", "segmentation_id": 288, "bound_drivers": {"0": "nsxv3"}}, "devname": "taped495c04-4a", "ovs_interfaceid": "ed495c04-4a8b-4554-a74c-7087a6458572", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 
1035.120584] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:95:e4:ce', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '77aa121f-8fb6-42f3-aaea-43addfe449b2', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'ed495c04-4a8b-4554-a74c-7087a6458572', 'vif_model': 'vmxnet3'}] {{(pid=62277) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1035.130028] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Creating folder: Project (e7da6090138b4821be06acab5460942e). Parent ref: group-v297781. {{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1035.131761] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-64ca775f-fc34-4aa2-a789-b1c1b2d056d4 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1035.143355] env[62277]: INFO nova.virt.vmwareapi.vm_util [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Created folder: Project (e7da6090138b4821be06acab5460942e) in parent group-v297781. [ 1035.143826] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Creating folder: Instances. Parent ref: group-v297814. {{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1035.143826] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-3b164870-7cc9-46d7-a70e-089a45a156f4 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1035.152531] env[62277]: INFO nova.virt.vmwareapi.vm_util [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Created folder: Instances in parent group-v297814. [ 1035.153435] env[62277]: DEBUG oslo.service.loopingcall [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1035.153435] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Creating VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1035.153435] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-78809105-e944-42b6-b742-01d2b329f149 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1035.171905] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1035.172324] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1035.173192] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1035.178035] env[62277]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1035.178035] env[62277]: value = "task-1405327" [ 1035.178035] env[62277]: _type = "Task" [ 1035.178035] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1035.188379] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405327, 'name': CreateVM_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1035.542982] env[62277]: DEBUG nova.compute.manager [req-e056f873-0d33-4257-bfcf-4867dd8b4849 req-0a1418d8-3f2d-4852-9faa-b8c1cfaee58b service nova] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Received event network-vif-plugged-ed495c04-4a8b-4554-a74c-7087a6458572 {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 1035.543319] env[62277]: DEBUG oslo_concurrency.lockutils [req-e056f873-0d33-4257-bfcf-4867dd8b4849 req-0a1418d8-3f2d-4852-9faa-b8c1cfaee58b service nova] Acquiring lock "866c4415-caab-4d81-86ba-ed662feb3c4f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1035.544466] env[62277]: DEBUG oslo_concurrency.lockutils [req-e056f873-0d33-4257-bfcf-4867dd8b4849 req-0a1418d8-3f2d-4852-9faa-b8c1cfaee58b service nova] Lock "866c4415-caab-4d81-86ba-ed662feb3c4f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1035.544466] env[62277]: DEBUG oslo_concurrency.lockutils [req-e056f873-0d33-4257-bfcf-4867dd8b4849 req-0a1418d8-3f2d-4852-9faa-b8c1cfaee58b service nova] Lock "866c4415-caab-4d81-86ba-ed662feb3c4f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1035.544466] env[62277]: DEBUG nova.compute.manager [req-e056f873-0d33-4257-bfcf-4867dd8b4849 req-0a1418d8-3f2d-4852-9faa-b8c1cfaee58b service nova] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] No waiting events found dispatching network-vif-plugged-ed495c04-4a8b-4554-a74c-7087a6458572 {{(pid=62277) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1035.544466] env[62277]: WARNING nova.compute.manager [req-e056f873-0d33-4257-bfcf-4867dd8b4849 req-0a1418d8-3f2d-4852-9faa-b8c1cfaee58b service nova] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Received unexpected event network-vif-plugged-ed495c04-4a8b-4554-a74c-7087a6458572 for instance with vm_state building and task_state spawning. [ 1035.692759] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405327, 'name': CreateVM_Task, 'duration_secs': 0.319119} completed successfully. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1035.692958] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Created VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1035.693595] env[62277]: DEBUG oslo_concurrency.lockutils [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1035.693751] env[62277]: DEBUG oslo_concurrency.lockutils [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1035.695875] env[62277]: DEBUG oslo_concurrency.lockutils [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1035.697828] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-31c13a1e-a695-42be-95ab-250c1203eafa {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1035.704199] env[62277]: DEBUG oslo_vmware.api [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Waiting for the task: (returnval){ [ 1035.704199] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]520a15eb-f379-71b2-a73c-55d4b422a050" [ 1035.704199] env[62277]: _type = "Task" [ 1035.704199] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1035.712701] env[62277]: DEBUG oslo_vmware.api [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]520a15eb-f379-71b2-a73c-55d4b422a050, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1036.218820] env[62277]: DEBUG oslo_concurrency.lockutils [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1036.219549] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Processing image 6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1036.219549] env[62277]: DEBUG oslo_concurrency.lockutils [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1037.169447] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1037.169447] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager.update_available_resource {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1037.184082] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1037.184359] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1037.184473] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1037.184626] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62277) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1037.185753] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-142560a0-52f7-4431-8d4c-fffc2a55bc7b {{(pid=62277) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1037.195730] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ebdc6b1-ade0-41fd-99cc-e76d9055b847 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1037.211435] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8573531c-4735-4b72-94a0-d47e9be75b7b {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1037.218431] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-958da4e5-6691-4a84-9698-2327ebb4e771 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1037.255110] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181356MB free_disk=184GB free_vcpus=48 pci_devices=None {{(pid=62277) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1037.255110] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1037.255110] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1037.376877] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 1829c328-2a68-4297-b80e-afb0e898ba72 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1037.376877] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 3d260cd8-ab21-4e1e-8891-6f216350a587 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1037.378213] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance d68ccb50-a04d-4e59-8161-f01305eb81a8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1037.378213] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 19d15611-315f-4c4b-8f32-e5d00d0d8ca8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1037.378213] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance dfc291fd-1481-4e76-9fb3-ec87124c1281 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1037.378213] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 36ff1435-1999-4e95-8920-81a1b25cc452 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1037.378401] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 68925f1b-da69-4955-acb1-d6500b03daee actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1037.378401] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 3862bece-d65f-4a89-b9fa-262ea01d10b9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1037.378401] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 43255fee-e5da-4fe0-8fa7-4aba7592745b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1037.378401] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 866c4415-caab-4d81-86ba-ed662feb3c4f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1037.412733] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 350e2302-66b9-4dd6-b0f4-77000992408b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1037.440353] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 23bc5a48-3e96-4897-bf28-ad14a0bdde62 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1037.453490] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 346748bd-b4e8-4e93-b71d-66c90a45e372 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1037.468073] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 930ff058-ab48-4c8a-8f5e-4820a3b12d50 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1037.487532] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 154fa64c-55d4-4b72-8af9-39e72fd5df5f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1037.503828] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1037.525148] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 79177b7a-1bf6-4649-80c7-4ba2c6cda0ad has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1037.540226] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 4cc3443d-3d05-4d3b-a222-6b7367c1c989 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1037.557724] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 04122dba-4cbf-4176-95bd-f1c4bfaa799e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1037.573628] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 30d7d279-7241-4f2e-b963-d205a5f9fa41 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1037.591155] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance fb905507-549f-4265-9851-4a42930c02a4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1037.611127] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 4f5018a5-26f5-4a31-8a9d-d2557c906995 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1037.626425] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance c01ad807-a9a5-4028-baf4-0469a6301459 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1037.626697] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1037.626857] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1038.134950] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a6757889-210c-4182-bd3c-212bb1dc0434 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1038.150945] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-63fff095-aa87-400c-afcb-580075c8ef32 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1038.194107] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b3dab6a-be36-4d9f-8025-43fe91d77a48 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1038.207962] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ee852012-62fc-4cb8-b83d-34f03c42c087 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1038.222552] env[62277]: DEBUG nova.compute.provider_tree [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1038.243164] env[62277]: DEBUG nova.scheduler.client.report [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1038.264066] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62277) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1038.264344] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.011s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1038.782764] env[62277]: DEBUG nova.compute.manager 
[req-5605fd95-7b51-4cdb-8125-7d511c0e5470 req-ee6da8fb-4119-4b63-aa38-cb20a29d00d8 service nova] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Received event network-changed-ed495c04-4a8b-4554-a74c-7087a6458572 {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 1038.782764] env[62277]: DEBUG nova.compute.manager [req-5605fd95-7b51-4cdb-8125-7d511c0e5470 req-ee6da8fb-4119-4b63-aa38-cb20a29d00d8 service nova] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Refreshing instance network info cache due to event network-changed-ed495c04-4a8b-4554-a74c-7087a6458572. {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11104}} [ 1038.782764] env[62277]: DEBUG oslo_concurrency.lockutils [req-5605fd95-7b51-4cdb-8125-7d511c0e5470 req-ee6da8fb-4119-4b63-aa38-cb20a29d00d8 service nova] Acquiring lock "refresh_cache-866c4415-caab-4d81-86ba-ed662feb3c4f" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1038.782764] env[62277]: DEBUG oslo_concurrency.lockutils [req-5605fd95-7b51-4cdb-8125-7d511c0e5470 req-ee6da8fb-4119-4b63-aa38-cb20a29d00d8 service nova] Acquired lock "refresh_cache-866c4415-caab-4d81-86ba-ed662feb3c4f" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1038.782990] env[62277]: DEBUG nova.network.neutron [req-5605fd95-7b51-4cdb-8125-7d511c0e5470 req-ee6da8fb-4119-4b63-aa38-cb20a29d00d8 service nova] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Refreshing network info cache for port ed495c04-4a8b-4554-a74c-7087a6458572 {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1039.264494] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1039.548954] env[62277]: DEBUG nova.network.neutron [req-5605fd95-7b51-4cdb-8125-7d511c0e5470 req-ee6da8fb-4119-4b63-aa38-cb20a29d00d8 service nova] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Updated VIF entry in instance network info cache for port ed495c04-4a8b-4554-a74c-7087a6458572. 
{{(pid=62277) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1039.549609] env[62277]: DEBUG nova.network.neutron [req-5605fd95-7b51-4cdb-8125-7d511c0e5470 req-ee6da8fb-4119-4b63-aa38-cb20a29d00d8 service nova] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Updating instance_info_cache with network_info: [{"id": "ed495c04-4a8b-4554-a74c-7087a6458572", "address": "fa:16:3e:95:e4:ce", "network": {"id": "2c64e00f-4995-4827-b685-5096b1d1d064", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.74", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "63ed6630e9c140baa826f53d7a0564d1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "77aa121f-8fb6-42f3-aaea-43addfe449b2", "external-id": "nsx-vlan-transportzone-288", "segmentation_id": 288, "bound_drivers": {"0": "nsxv3"}}, "devname": "taped495c04-4a", "ovs_interfaceid": "ed495c04-4a8b-4554-a74c-7087a6458572", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1039.567265] env[62277]: DEBUG oslo_concurrency.lockutils [req-5605fd95-7b51-4cdb-8125-7d511c0e5470 req-ee6da8fb-4119-4b63-aa38-cb20a29d00d8 service nova] Releasing lock "refresh_cache-866c4415-caab-4d81-86ba-ed662feb3c4f" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1040.424325] env[62277]: DEBUG oslo_concurrency.lockutils [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Acquiring lock "1e8429b2-7149-4832-8590-e0ebd8501176" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1040.424608] env[62277]: DEBUG oslo_concurrency.lockutils [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Lock "1e8429b2-7149-4832-8590-e0ebd8501176" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1041.169145] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1041.169145] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=62277) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10528}} [ 1042.200227] env[62277]: DEBUG oslo_concurrency.lockutils [None req-8efded13-d761-4957-905d-83cb75b63cfd tempest-ServerDiagnosticsTest-825586886 tempest-ServerDiagnosticsTest-825586886-project-member] Acquiring lock "f646a534-1ae8-40dd-9819-3d71bda87ae2" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1042.201106] env[62277]: DEBUG oslo_concurrency.lockutils [None req-8efded13-d761-4957-905d-83cb75b63cfd tempest-ServerDiagnosticsTest-825586886 tempest-ServerDiagnosticsTest-825586886-project-member] Lock "f646a534-1ae8-40dd-9819-3d71bda87ae2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1042.399143] env[62277]: DEBUG oslo_concurrency.lockutils [None req-be1c341e-2f32-49d0-9112-018733d49685 tempest-InstanceActionsV221TestJSON-794172250 tempest-InstanceActionsV221TestJSON-794172250-project-member] Acquiring lock "2b98866e-3c86-47bd-9eff-2c2743631563" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1042.399461] env[62277]: DEBUG oslo_concurrency.lockutils [None req-be1c341e-2f32-49d0-9112-018733d49685 tempest-InstanceActionsV221TestJSON-794172250 tempest-InstanceActionsV221TestJSON-794172250-project-member] Lock "2b98866e-3c86-47bd-9eff-2c2743631563" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1043.151746] env[62277]: DEBUG oslo_concurrency.lockutils [None req-5c153d52-0ee9-437c-910f-1755e96827aa tempest-ServerShowV254Test-1304854974 tempest-ServerShowV254Test-1304854974-project-member] Acquiring lock "560b7750-03fe-4a4c-ab1d-a1751895986b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1043.152322] env[62277]: DEBUG oslo_concurrency.lockutils [None req-5c153d52-0ee9-437c-910f-1755e96827aa tempest-ServerShowV254Test-1304854974 tempest-ServerShowV254Test-1304854974-project-member] Lock "560b7750-03fe-4a4c-ab1d-a1751895986b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1044.417191] env[62277]: DEBUG oslo_concurrency.lockutils [None req-286b155a-78f0-4f14-9ee9-d6607e7a609d tempest-ImagesNegativeTestJSON-1671068824 tempest-ImagesNegativeTestJSON-1671068824-project-member] Acquiring lock "86e8e8ba-e476-400d-b180-bb7df8a042d8" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1044.417570] env[62277]: DEBUG oslo_concurrency.lockutils [None req-286b155a-78f0-4f14-9ee9-d6607e7a609d tempest-ImagesNegativeTestJSON-1671068824 
tempest-ImagesNegativeTestJSON-1671068824-project-member] Lock "86e8e8ba-e476-400d-b180-bb7df8a042d8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1052.043033] env[62277]: DEBUG oslo_concurrency.lockutils [None req-ad261aee-be38-472d-a103-72943d529497 tempest-ServersNegativeTestMultiTenantJSON-169553101 tempest-ServersNegativeTestMultiTenantJSON-169553101-project-member] Acquiring lock "c3e72352-f795-4ce7-9e0b-4e80c4329f7b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1052.043033] env[62277]: DEBUG oslo_concurrency.lockutils [None req-ad261aee-be38-472d-a103-72943d529497 tempest-ServersNegativeTestMultiTenantJSON-169553101 tempest-ServersNegativeTestMultiTenantJSON-169553101-project-member] Lock "c3e72352-f795-4ce7-9e0b-4e80c4329f7b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1057.493737] env[62277]: WARNING oslo_vmware.rw_handles [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1057.493737] env[62277]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1057.493737] env[62277]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1057.493737] env[62277]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1057.493737] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1057.493737] env[62277]: ERROR oslo_vmware.rw_handles response.begin() [ 1057.493737] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1057.493737] env[62277]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1057.493737] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1057.493737] env[62277]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1057.493737] env[62277]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1057.493737] env[62277]: ERROR oslo_vmware.rw_handles [ 1057.494332] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] Downloaded image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to vmware_temp/2c2ab30f-13c6-447b-88b9-42754657c9fa/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore1 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1057.499286] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 
tempest-AttachInterfacesUnderV243Test-476296311-project-member] [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] Caching image {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1057.499538] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Copying Virtual Disk [datastore1] vmware_temp/2c2ab30f-13c6-447b-88b9-42754657c9fa/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk to [datastore1] vmware_temp/2c2ab30f-13c6-447b-88b9-42754657c9fa/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk {{(pid=62277) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1057.499822] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-ee128d54-76c3-4c7b-be31-c23dea3f062b {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1057.507946] env[62277]: DEBUG oslo_vmware.api [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Waiting for the task: (returnval){ [ 1057.507946] env[62277]: value = "task-1405328" [ 1057.507946] env[62277]: _type = "Task" [ 1057.507946] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1057.516922] env[62277]: DEBUG oslo_vmware.api [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Task: {'id': task-1405328, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1058.019718] env[62277]: DEBUG oslo_vmware.exceptions [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Fault InvalidArgument not matched. 
{{(pid=62277) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1058.020009] env[62277]: DEBUG oslo_concurrency.lockutils [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1058.020565] env[62277]: ERROR nova.compute.manager [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1058.020565] env[62277]: Faults: ['InvalidArgument'] [ 1058.020565] env[62277]: ERROR nova.compute.manager [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] Traceback (most recent call last): [ 1058.020565] env[62277]: ERROR nova.compute.manager [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1058.020565] env[62277]: ERROR nova.compute.manager [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] yield resources [ 1058.020565] env[62277]: ERROR nova.compute.manager [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1058.020565] env[62277]: ERROR nova.compute.manager [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] self.driver.spawn(context, instance, image_meta, [ 1058.020565] env[62277]: ERROR nova.compute.manager [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1058.020565] env[62277]: ERROR nova.compute.manager [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1058.020565] env[62277]: ERROR nova.compute.manager [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1058.020565] env[62277]: ERROR nova.compute.manager [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] self._fetch_image_if_missing(context, vi) [ 1058.020565] env[62277]: ERROR nova.compute.manager [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1058.020953] env[62277]: ERROR nova.compute.manager [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] image_cache(vi, tmp_image_ds_loc) [ 1058.020953] env[62277]: ERROR nova.compute.manager [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1058.020953] env[62277]: ERROR nova.compute.manager [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] vm_util.copy_virtual_disk( [ 1058.020953] env[62277]: ERROR nova.compute.manager [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1058.020953] env[62277]: ERROR nova.compute.manager [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] session._wait_for_task(vmdk_copy_task) [ 1058.020953] env[62277]: ERROR nova.compute.manager [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1058.020953] env[62277]: ERROR nova.compute.manager [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] return self.wait_for_task(task_ref) [ 1058.020953] env[62277]: ERROR nova.compute.manager [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1058.020953] env[62277]: ERROR nova.compute.manager [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] return evt.wait() [ 1058.020953] env[62277]: ERROR nova.compute.manager [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1058.020953] env[62277]: ERROR nova.compute.manager [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] result = hub.switch() [ 1058.020953] env[62277]: ERROR nova.compute.manager [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1058.020953] env[62277]: ERROR nova.compute.manager [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] return self.greenlet.switch() [ 1058.021361] env[62277]: ERROR nova.compute.manager [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1058.021361] env[62277]: ERROR nova.compute.manager [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] self.f(*self.args, **self.kw) [ 1058.021361] env[62277]: ERROR nova.compute.manager [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1058.021361] env[62277]: ERROR nova.compute.manager [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] raise exceptions.translate_fault(task_info.error) [ 1058.021361] env[62277]: ERROR nova.compute.manager [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1058.021361] env[62277]: ERROR nova.compute.manager [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] Faults: ['InvalidArgument'] [ 1058.021361] env[62277]: ERROR nova.compute.manager [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] [ 1058.021361] env[62277]: INFO nova.compute.manager [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] Terminating instance [ 1058.022571] env[62277]: DEBUG oslo_concurrency.lockutils [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1058.022773] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1058.023575] env[62277]: DEBUG nova.compute.manager [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 
tempest-AttachInterfacesUnderV243Test-476296311-project-member] [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] Start destroying the instance on the hypervisor. {{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1058.023771] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1058.023993] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c8bf10d3-aa74-4a88-8831-21f09e175080 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1058.026817] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e9f4104-b5fa-49e1-a2a9-a73208e72c81 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1058.034200] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] Unregistering the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1058.034437] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-2eab0a1c-c972-4529-9b50-65a53d55ec77 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1058.042383] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1058.042561] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=62277) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1058.043281] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-070ade1c-5a41-4095-ab49-fafbe5305d74 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1058.048058] env[62277]: DEBUG oslo_vmware.api [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Waiting for the task: (returnval){ [ 1058.048058] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52152822-dda5-ba2f-2006-4e70da3dc40d" [ 1058.048058] env[62277]: _type = "Task" [ 1058.048058] env[62277]: } to complete. 
{{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1058.055947] env[62277]: DEBUG oslo_vmware.api [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52152822-dda5-ba2f-2006-4e70da3dc40d, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1058.105865] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] Unregistered the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1058.106155] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] Deleting contents of the VM from datastore datastore1 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1058.106406] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Deleting the datastore file [datastore1] 3862bece-d65f-4a89-b9fa-262ea01d10b9 {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1058.106730] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-5a60f85e-ff6d-4334-9c6e-92d5cd94d87d {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1058.114725] env[62277]: DEBUG oslo_vmware.api [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Waiting for the task: (returnval){ [ 1058.114725] env[62277]: value = "task-1405330" [ 1058.114725] env[62277]: _type = "Task" [ 1058.114725] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1058.124926] env[62277]: DEBUG oslo_vmware.api [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Task: {'id': task-1405330, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1058.562018] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] Preparing fetch location {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1058.562018] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Creating directory with path [datastore1] vmware_temp/b7191af3-61e4-4c8b-b087-6ce0f3f50e47/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1058.562018] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-7f98b301-47ec-4fc6-b971-3acdd7ae169c {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1058.573170] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Created directory with path [datastore1] vmware_temp/b7191af3-61e4-4c8b-b087-6ce0f3f50e47/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1058.573383] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] Fetch image to [datastore1] vmware_temp/b7191af3-61e4-4c8b-b087-6ce0f3f50e47/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1058.573554] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to [datastore1] vmware_temp/b7191af3-61e4-4c8b-b087-6ce0f3f50e47/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore1 {{(pid=62277) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1058.574340] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c906f76-e190-48d5-b054-2248458e9507 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1058.581790] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e42fd3ab-4322-4eab-8276-0683292e3c3c {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1058.592317] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35d5514a-3030-407c-b8c2-bddf6ad4ff27 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1058.629681] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-29d61684-55ad-4ee8-a2d0-1dff014343dd {{(pid=62277) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1058.639233] env[62277]: DEBUG oslo_vmware.api [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Task: {'id': task-1405330, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.113318} completed successfully. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1058.640085] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Deleted the datastore file {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1058.640085] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] Deleted contents of the VM from datastore datastore1 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1058.640085] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1058.640085] env[62277]: INFO nova.compute.manager [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] Took 0.62 seconds to destroy the instance on the hypervisor. 
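Editor's note: the CopyVirtualDisk_Task failure recorded above is surfaced through the task-polling loop that produces the "Waiting for the task", "progress is 0%" and "Fault InvalidArgument not matched" entries: the driver submits a vCenter task, polls its state, and an error state is translated into the VimFaultException ("A specified parameter was not correct: fileType", fault InvalidArgument). The sketch below only illustrates that poll-and-translate pattern; fetch_task_info() and the VimFaultException class here are hypothetical stand-ins, not the oslo.vmware implementation the traceback points at.

# Illustrative sketch of the task-polling pattern visible in the log above.
# fetch_task_info() and this VimFaultException are hypothetical stand-ins.
import time


class VimFaultException(Exception):
    def __init__(self, fault_list, message):
        super().__init__(message)
        self.fault_list = fault_list


def wait_for_task(task_ref, fetch_task_info, interval=0.5):
    """Poll a vCenter task reference until it succeeds or errors out."""
    while True:
        info = fetch_task_info(task_ref)   # one property-collector round trip per poll
        if info['state'] == 'success':
            return info
        if info['state'] == 'error':
            # Mirrors the log: fault name ('InvalidArgument') plus the
            # localized message ('A specified parameter was not correct: fileType').
            raise VimFaultException(info.get('faults', []), info['message'])
        time.sleep(interval)               # the 'progress is 0%' entries come from this loop
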
[ 1058.642643] env[62277]: DEBUG nova.compute.claims [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] Aborting claim: {{(pid=62277) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1058.642864] env[62277]: DEBUG oslo_concurrency.lockutils [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1058.643141] env[62277]: DEBUG oslo_concurrency.lockutils [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1058.646993] env[62277]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-55ef66ec-100d-4d91-b244-fae0280d62ab {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1058.736705] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to the data store datastore1 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1058.817876] env[62277]: DEBUG oslo_vmware.rw_handles [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b7191af3-61e4-4c8b-b087-6ce0f3f50e47/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=62277) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1058.879875] env[62277]: DEBUG oslo_vmware.rw_handles [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Completed reading data from the image iterator. {{(pid=62277) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1058.880043] env[62277]: DEBUG oslo_vmware.rw_handles [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b7191af3-61e4-4c8b-b087-6ce0f3f50e47/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=62277) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1059.183081] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d8bee68c-3f34-4c26-b448-0bfa7a1695f8 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1059.190737] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c0adc9f8-9268-4ed2-9af9-7bec238bbe3e {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1059.219915] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ea78365e-0d63-4b27-aca4-f13e939102c1 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1059.227453] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-287dd84a-356f-4825-83fe-4c4a6a506a7c {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1059.244202] env[62277]: DEBUG nova.compute.provider_tree [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1059.255340] env[62277]: DEBUG nova.scheduler.client.report [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1059.272254] env[62277]: DEBUG oslo_concurrency.lockutils [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.629s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1059.272790] env[62277]: ERROR nova.compute.manager [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1059.272790] env[62277]: Faults: ['InvalidArgument'] [ 1059.272790] env[62277]: ERROR nova.compute.manager [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] Traceback (most recent call last): [ 1059.272790] env[62277]: ERROR nova.compute.manager [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in 
_build_and_run_instance [ 1059.272790] env[62277]: ERROR nova.compute.manager [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] self.driver.spawn(context, instance, image_meta, [ 1059.272790] env[62277]: ERROR nova.compute.manager [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1059.272790] env[62277]: ERROR nova.compute.manager [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1059.272790] env[62277]: ERROR nova.compute.manager [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1059.272790] env[62277]: ERROR nova.compute.manager [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] self._fetch_image_if_missing(context, vi) [ 1059.272790] env[62277]: ERROR nova.compute.manager [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1059.272790] env[62277]: ERROR nova.compute.manager [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] image_cache(vi, tmp_image_ds_loc) [ 1059.272790] env[62277]: ERROR nova.compute.manager [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1059.273452] env[62277]: ERROR nova.compute.manager [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] vm_util.copy_virtual_disk( [ 1059.273452] env[62277]: ERROR nova.compute.manager [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1059.273452] env[62277]: ERROR nova.compute.manager [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] session._wait_for_task(vmdk_copy_task) [ 1059.273452] env[62277]: ERROR nova.compute.manager [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1059.273452] env[62277]: ERROR nova.compute.manager [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] return self.wait_for_task(task_ref) [ 1059.273452] env[62277]: ERROR nova.compute.manager [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1059.273452] env[62277]: ERROR nova.compute.manager [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] return evt.wait() [ 1059.273452] env[62277]: ERROR nova.compute.manager [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1059.273452] env[62277]: ERROR nova.compute.manager [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] result = hub.switch() [ 1059.273452] env[62277]: ERROR nova.compute.manager [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1059.273452] env[62277]: ERROR nova.compute.manager [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] return self.greenlet.switch() [ 1059.273452] env[62277]: ERROR nova.compute.manager [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1059.273452] env[62277]: ERROR nova.compute.manager [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] self.f(*self.args, **self.kw) [ 1059.273844] env[62277]: ERROR nova.compute.manager [instance: 
3862bece-d65f-4a89-b9fa-262ea01d10b9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1059.273844] env[62277]: ERROR nova.compute.manager [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] raise exceptions.translate_fault(task_info.error) [ 1059.273844] env[62277]: ERROR nova.compute.manager [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1059.273844] env[62277]: ERROR nova.compute.manager [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] Faults: ['InvalidArgument'] [ 1059.273844] env[62277]: ERROR nova.compute.manager [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] [ 1059.273844] env[62277]: DEBUG nova.compute.utils [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] VimFaultException {{(pid=62277) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1059.275255] env[62277]: DEBUG nova.compute.manager [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] Build of instance 3862bece-d65f-4a89-b9fa-262ea01d10b9 was re-scheduled: A specified parameter was not correct: fileType [ 1059.275255] env[62277]: Faults: ['InvalidArgument'] {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1059.276034] env[62277]: DEBUG nova.compute.manager [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] Unplugging VIFs for instance {{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1059.276034] env[62277]: DEBUG nova.compute.manager [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1059.276034] env[62277]: DEBUG nova.compute.manager [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1059.276034] env[62277]: DEBUG nova.network.neutron [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1059.560056] env[62277]: DEBUG oslo_concurrency.lockutils [None req-832722ed-7844-4d20-8fab-7929c1c0c2ff tempest-ServerActionsTestOtherB-254108444 tempest-ServerActionsTestOtherB-254108444-project-member] Acquiring lock "b6908c32-5916-4a0e-92e2-21f480c5f7ca" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1059.560302] env[62277]: DEBUG oslo_concurrency.lockutils [None req-832722ed-7844-4d20-8fab-7929c1c0c2ff tempest-ServerActionsTestOtherB-254108444 tempest-ServerActionsTestOtherB-254108444-project-member] Lock "b6908c32-5916-4a0e-92e2-21f480c5f7ca" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1059.691379] env[62277]: DEBUG nova.network.neutron [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1059.709141] env[62277]: INFO nova.compute.manager [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] [instance: 3862bece-d65f-4a89-b9fa-262ea01d10b9] Took 0.43 seconds to deallocate network for instance. 
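Editor's note: the entries from the "Failed to build and run instance" traceback through "was re-scheduled", "Deallocating network for instance" and the empty instance_info_cache update trace the compute manager's cleanup path for a failed spawn: abort the resource claim, tear down networking, and hand the build back for rescheduling. The condensed sketch below is only an illustration of that flow under assumed helper names (claim.abort(), network_api.deallocate_for_instance(), RescheduledException); it is not the nova/compute/manager.py code the traceback references.

# Condensed illustration of the failed-build cleanup flow seen in the log.
# driver, claim, network_api and RescheduledException are hypothetical stand-ins.
class RescheduledException(Exception):
    """Signals that the build should be retried elsewhere."""


def build_and_run_instance(driver, claim, network_api, context, instance):
    try:
        driver.spawn(context, instance)                # raises VimFaultException above
    except Exception as exc:
        claim.abort()                                  # 'Aborting claim' under the compute_resources lock
        network_api.deallocate_for_instance(context, instance)   # 'Deallocating network for instance'
        instance.network_info = []                     # cache updated with network_info: []
        raise RescheduledException(
            f"Build of instance {instance.uuid} was re-scheduled: {exc}")
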
[ 1059.809901] env[62277]: INFO nova.scheduler.client.report [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Deleted allocations for instance 3862bece-d65f-4a89-b9fa-262ea01d10b9 [ 1059.830495] env[62277]: DEBUG oslo_concurrency.lockutils [None req-be6dbfd7-e13e-4a5e-ad7f-04c9fb7ea05c tempest-AttachInterfacesUnderV243Test-476296311 tempest-AttachInterfacesUnderV243Test-476296311-project-member] Lock "3862bece-d65f-4a89-b9fa-262ea01d10b9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 54.782s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1059.859468] env[62277]: DEBUG nova.compute.manager [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1059.918441] env[62277]: DEBUG oslo_concurrency.lockutils [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1059.918692] env[62277]: DEBUG oslo_concurrency.lockutils [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1059.920204] env[62277]: INFO nova.compute.claims [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1060.358820] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-46cf83dc-3d03-42e1-b43c-6ac3a842c79d {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1060.366499] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e093496-9f0b-4ba0-8869-8f7281be4d5d {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1060.396976] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e8463015-f3c3-44a4-85e5-7e93930da35d {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1060.404791] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1204e455-f00a-48fb-9aff-919c14bf7108 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1060.417828] env[62277]: DEBUG nova.compute.provider_tree [None 
req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1060.426386] env[62277]: DEBUG nova.scheduler.client.report [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1060.441849] env[62277]: DEBUG oslo_concurrency.lockutils [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.523s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1060.442355] env[62277]: DEBUG nova.compute.manager [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Start building networks asynchronously for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1060.480093] env[62277]: DEBUG nova.compute.utils [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Using /dev/sd instead of None {{(pid=62277) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1060.481386] env[62277]: DEBUG nova.compute.manager [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Allocating IP information in the background. {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1060.481623] env[62277]: DEBUG nova.network.neutron [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] allocate_for_instance() {{(pid=62277) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1060.490136] env[62277]: DEBUG nova.compute.manager [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Start building block device mappings for instance. 
{{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1060.540012] env[62277]: DEBUG nova.policy [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '22a8cdd3f08c494ba03339764f48cec9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '32003a0dfe004ada8d788df7a65c0a0f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62277) authorize /opt/stack/nova/nova/policy.py:203}} [ 1060.570950] env[62277]: DEBUG nova.compute.manager [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Start spawning the instance on the hypervisor. {{(pid=62277) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1060.604991] env[62277]: DEBUG nova.virt.hardware [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-11-06T22:11:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-11-06T22:11:15Z,direct_url=,disk_format='vmdk',id=6f125163-af69-40e9-92ae-3b8a01d74b60,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='63ed6630e9c140baa826f53d7a0564d1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-11-06T22:11:16Z,virtual_size=,visibility=), allow threads: False {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1060.605247] env[62277]: DEBUG nova.virt.hardware [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Flavor limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1060.605477] env[62277]: DEBUG nova.virt.hardware [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Image limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1060.605815] env[62277]: DEBUG nova.virt.hardware [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Flavor pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1060.605912] env[62277]: DEBUG nova.virt.hardware [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Image pref 0:0:0 {{(pid=62277) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1060.606108] env[62277]: DEBUG nova.virt.hardware [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1060.606351] env[62277]: DEBUG nova.virt.hardware [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1060.606552] env[62277]: DEBUG nova.virt.hardware [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1060.606754] env[62277]: DEBUG nova.virt.hardware [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Got 1 possible topologies {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1060.606944] env[62277]: DEBUG nova.virt.hardware [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1060.607231] env[62277]: DEBUG nova.virt.hardware [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1060.608357] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf64eb29-884c-4869-8f9d-b244abd2cb6b {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1060.616907] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ea6a0738-aeab-4b83-a045-bad9adb2f028 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1060.922316] env[62277]: DEBUG nova.network.neutron [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Successfully created port: 488d4dda-ce74-47a7-9621-15d4fa8a7bf2 {{(pid=62277) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1061.968269] env[62277]: DEBUG nova.network.neutron [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Successfully 
updated port: 488d4dda-ce74-47a7-9621-15d4fa8a7bf2 {{(pid=62277) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1061.988819] env[62277]: DEBUG oslo_concurrency.lockutils [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Acquiring lock "refresh_cache-350e2302-66b9-4dd6-b0f4-77000992408b" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1061.989901] env[62277]: DEBUG oslo_concurrency.lockutils [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Acquired lock "refresh_cache-350e2302-66b9-4dd6-b0f4-77000992408b" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1061.989901] env[62277]: DEBUG nova.network.neutron [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Building network info cache for instance {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1062.045840] env[62277]: DEBUG nova.network.neutron [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Instance cache missing network info. {{(pid=62277) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1062.284370] env[62277]: DEBUG nova.network.neutron [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Updating instance_info_cache with network_info: [{"id": "488d4dda-ce74-47a7-9621-15d4fa8a7bf2", "address": "fa:16:3e:1e:1b:2e", "network": {"id": "fba9a62e-8cae-464e-bcac-1496a2bdebec", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1794903657-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "32003a0dfe004ada8d788df7a65c0a0f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "517421c3-bea0-419c-ab0b-987815e5d160", "external-id": "nsx-vlan-transportzone-68", "segmentation_id": 68, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap488d4dda-ce", "ovs_interfaceid": "488d4dda-ce74-47a7-9621-15d4fa8a7bf2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1062.297138] env[62277]: DEBUG oslo_concurrency.lockutils [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Releasing lock 
"refresh_cache-350e2302-66b9-4dd6-b0f4-77000992408b" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1062.297436] env[62277]: DEBUG nova.compute.manager [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Instance network_info: |[{"id": "488d4dda-ce74-47a7-9621-15d4fa8a7bf2", "address": "fa:16:3e:1e:1b:2e", "network": {"id": "fba9a62e-8cae-464e-bcac-1496a2bdebec", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1794903657-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "32003a0dfe004ada8d788df7a65c0a0f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "517421c3-bea0-419c-ab0b-987815e5d160", "external-id": "nsx-vlan-transportzone-68", "segmentation_id": 68, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap488d4dda-ce", "ovs_interfaceid": "488d4dda-ce74-47a7-9621-15d4fa8a7bf2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1062.297822] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:1e:1b:2e', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '517421c3-bea0-419c-ab0b-987815e5d160', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '488d4dda-ce74-47a7-9621-15d4fa8a7bf2', 'vif_model': 'vmxnet3'}] {{(pid=62277) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1062.312220] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Creating folder: Project (32003a0dfe004ada8d788df7a65c0a0f). Parent ref: group-v297781. 
{{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1062.312220] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7e796a69-c4f8-4247-ab70-7d52f581f385 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1062.315997] env[62277]: DEBUG nova.compute.manager [req-3eda95d2-ca86-4507-a594-d7a5c4c4dccd req-5c6f3539-2428-45b4-a095-6bd80d9626c5 service nova] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Received event network-vif-plugged-488d4dda-ce74-47a7-9621-15d4fa8a7bf2 {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 1062.316209] env[62277]: DEBUG oslo_concurrency.lockutils [req-3eda95d2-ca86-4507-a594-d7a5c4c4dccd req-5c6f3539-2428-45b4-a095-6bd80d9626c5 service nova] Acquiring lock "350e2302-66b9-4dd6-b0f4-77000992408b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1062.316579] env[62277]: DEBUG oslo_concurrency.lockutils [req-3eda95d2-ca86-4507-a594-d7a5c4c4dccd req-5c6f3539-2428-45b4-a095-6bd80d9626c5 service nova] Lock "350e2302-66b9-4dd6-b0f4-77000992408b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1062.316767] env[62277]: DEBUG oslo_concurrency.lockutils [req-3eda95d2-ca86-4507-a594-d7a5c4c4dccd req-5c6f3539-2428-45b4-a095-6bd80d9626c5 service nova] Lock "350e2302-66b9-4dd6-b0f4-77000992408b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1062.316929] env[62277]: DEBUG nova.compute.manager [req-3eda95d2-ca86-4507-a594-d7a5c4c4dccd req-5c6f3539-2428-45b4-a095-6bd80d9626c5 service nova] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] No waiting events found dispatching network-vif-plugged-488d4dda-ce74-47a7-9621-15d4fa8a7bf2 {{(pid=62277) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1062.317098] env[62277]: WARNING nova.compute.manager [req-3eda95d2-ca86-4507-a594-d7a5c4c4dccd req-5c6f3539-2428-45b4-a095-6bd80d9626c5 service nova] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Received unexpected event network-vif-plugged-488d4dda-ce74-47a7-9621-15d4fa8a7bf2 for instance with vm_state building and task_state spawning. [ 1062.326405] env[62277]: INFO nova.virt.vmwareapi.vm_util [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Created folder: Project (32003a0dfe004ada8d788df7a65c0a0f) in parent group-v297781. [ 1062.326588] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Creating folder: Instances. Parent ref: group-v297817. 
{{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1062.326812] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-67be65db-254e-4dbb-a095-78e23d0c9ff1 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1062.338016] env[62277]: INFO nova.virt.vmwareapi.vm_util [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Created folder: Instances in parent group-v297817. [ 1062.338016] env[62277]: DEBUG oslo.service.loopingcall [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1062.338016] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Creating VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1062.338016] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-982fcfe8-d72f-485a-a793-731c00bfc667 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1062.357282] env[62277]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1062.357282] env[62277]: value = "task-1405333" [ 1062.357282] env[62277]: _type = "Task" [ 1062.357282] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1062.364154] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405333, 'name': CreateVM_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1062.868837] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405333, 'name': CreateVM_Task, 'duration_secs': 0.302712} completed successfully. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1062.869120] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Created VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1062.869957] env[62277]: DEBUG oslo_concurrency.lockutils [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1062.870143] env[62277]: DEBUG oslo_concurrency.lockutils [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1062.870461] env[62277]: DEBUG oslo_concurrency.lockutils [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1062.870723] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-977297ed-cb52-4058-92e5-99fee0adef6f {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1062.875997] env[62277]: DEBUG oslo_vmware.api [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Waiting for the task: (returnval){ [ 1062.875997] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]524d2350-0f3f-5a00-5702-9bcb84382f61" [ 1062.875997] env[62277]: _type = "Task" [ 1062.875997] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1062.884371] env[62277]: DEBUG oslo_vmware.api [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]524d2350-0f3f-5a00-5702-9bcb84382f61, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1063.389972] env[62277]: DEBUG oslo_concurrency.lockutils [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1063.390274] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Processing image 6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1063.390600] env[62277]: DEBUG oslo_concurrency.lockutils [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1064.335340] env[62277]: DEBUG nova.compute.manager [req-898208d8-9c83-4cc8-8feb-488632ad2780 req-c337e9e3-59df-4158-81e5-e0a4d28e25e2 service nova] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Received event network-changed-488d4dda-ce74-47a7-9621-15d4fa8a7bf2 {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 1064.335340] env[62277]: DEBUG nova.compute.manager [req-898208d8-9c83-4cc8-8feb-488632ad2780 req-c337e9e3-59df-4158-81e5-e0a4d28e25e2 service nova] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Refreshing instance network info cache due to event network-changed-488d4dda-ce74-47a7-9621-15d4fa8a7bf2. {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11104}} [ 1064.335512] env[62277]: DEBUG oslo_concurrency.lockutils [req-898208d8-9c83-4cc8-8feb-488632ad2780 req-c337e9e3-59df-4158-81e5-e0a4d28e25e2 service nova] Acquiring lock "refresh_cache-350e2302-66b9-4dd6-b0f4-77000992408b" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1064.335652] env[62277]: DEBUG oslo_concurrency.lockutils [req-898208d8-9c83-4cc8-8feb-488632ad2780 req-c337e9e3-59df-4158-81e5-e0a4d28e25e2 service nova] Acquired lock "refresh_cache-350e2302-66b9-4dd6-b0f4-77000992408b" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1064.335809] env[62277]: DEBUG nova.network.neutron [req-898208d8-9c83-4cc8-8feb-488632ad2780 req-c337e9e3-59df-4158-81e5-e0a4d28e25e2 service nova] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Refreshing network info cache for port 488d4dda-ce74-47a7-9621-15d4fa8a7bf2 {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1064.782077] env[62277]: DEBUG nova.network.neutron [req-898208d8-9c83-4cc8-8feb-488632ad2780 req-c337e9e3-59df-4158-81e5-e0a4d28e25e2 service nova] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Updated VIF entry in instance network info cache for port 488d4dda-ce74-47a7-9621-15d4fa8a7bf2. 
{{(pid=62277) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1064.782490] env[62277]: DEBUG nova.network.neutron [req-898208d8-9c83-4cc8-8feb-488632ad2780 req-c337e9e3-59df-4158-81e5-e0a4d28e25e2 service nova] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Updating instance_info_cache with network_info: [{"id": "488d4dda-ce74-47a7-9621-15d4fa8a7bf2", "address": "fa:16:3e:1e:1b:2e", "network": {"id": "fba9a62e-8cae-464e-bcac-1496a2bdebec", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1794903657-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "32003a0dfe004ada8d788df7a65c0a0f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "517421c3-bea0-419c-ab0b-987815e5d160", "external-id": "nsx-vlan-transportzone-68", "segmentation_id": 68, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap488d4dda-ce", "ovs_interfaceid": "488d4dda-ce74-47a7-9621-15d4fa8a7bf2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1064.792230] env[62277]: DEBUG oslo_concurrency.lockutils [req-898208d8-9c83-4cc8-8feb-488632ad2780 req-c337e9e3-59df-4158-81e5-e0a4d28e25e2 service nova] Releasing lock "refresh_cache-350e2302-66b9-4dd6-b0f4-77000992408b" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1068.398118] env[62277]: DEBUG oslo_concurrency.lockutils [None req-267d2afe-de5d-4805-b2e9-11d8f2b57316 tempest-InstanceActionsNegativeTestJSON-286532833 tempest-InstanceActionsNegativeTestJSON-286532833-project-member] Acquiring lock "27cd13ca-a17c-476e-a00a-cca1fe898763" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1068.399022] env[62277]: DEBUG oslo_concurrency.lockutils [None req-267d2afe-de5d-4805-b2e9-11d8f2b57316 tempest-InstanceActionsNegativeTestJSON-286532833 tempest-InstanceActionsNegativeTestJSON-286532833-project-member] Lock "27cd13ca-a17c-476e-a00a-cca1fe898763" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1079.115462] env[62277]: WARNING oslo_vmware.rw_handles [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1079.115462] env[62277]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1079.115462] env[62277]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1079.115462] env[62277]: ERROR 
oslo_vmware.rw_handles self._conn.getresponse() [ 1079.115462] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1079.115462] env[62277]: ERROR oslo_vmware.rw_handles response.begin() [ 1079.115462] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1079.115462] env[62277]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1079.115462] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1079.115462] env[62277]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1079.115462] env[62277]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1079.115462] env[62277]: ERROR oslo_vmware.rw_handles [ 1079.115957] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] Downloaded image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to vmware_temp/7b9d2a9f-81cb-4e3d-a4af-c0c56bc1b793/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1079.117532] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] Caching image {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1079.117832] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Copying Virtual Disk [datastore2] vmware_temp/7b9d2a9f-81cb-4e3d-a4af-c0c56bc1b793/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk to [datastore2] vmware_temp/7b9d2a9f-81cb-4e3d-a4af-c0c56bc1b793/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk {{(pid=62277) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1079.118165] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-7885d6d7-7c1c-468d-b4e2-8f672aef895e {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1079.126392] env[62277]: DEBUG oslo_vmware.api [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Waiting for the task: (returnval){ [ 1079.126392] env[62277]: value = "task-1405334" [ 1079.126392] env[62277]: _type = "Task" [ 1079.126392] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1079.135451] env[62277]: DEBUG oslo_vmware.api [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Task: {'id': task-1405334, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1079.638151] env[62277]: DEBUG oslo_vmware.exceptions [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Fault InvalidArgument not matched. {{(pid=62277) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1079.638445] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1079.639021] env[62277]: ERROR nova.compute.manager [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1079.639021] env[62277]: Faults: ['InvalidArgument'] [ 1079.639021] env[62277]: ERROR nova.compute.manager [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] Traceback (most recent call last): [ 1079.639021] env[62277]: ERROR nova.compute.manager [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1079.639021] env[62277]: ERROR nova.compute.manager [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] yield resources [ 1079.639021] env[62277]: ERROR nova.compute.manager [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1079.639021] env[62277]: ERROR nova.compute.manager [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] self.driver.spawn(context, instance, image_meta, [ 1079.639021] env[62277]: ERROR nova.compute.manager [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1079.639021] env[62277]: ERROR nova.compute.manager [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1079.639021] env[62277]: ERROR nova.compute.manager [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1079.639021] env[62277]: ERROR nova.compute.manager [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] self._fetch_image_if_missing(context, vi) [ 1079.639021] env[62277]: ERROR nova.compute.manager [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1079.639285] env[62277]: ERROR nova.compute.manager [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] image_cache(vi, tmp_image_ds_loc) [ 1079.639285] env[62277]: ERROR nova.compute.manager [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1079.639285] env[62277]: ERROR nova.compute.manager [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] vm_util.copy_virtual_disk( [ 1079.639285] env[62277]: ERROR nova.compute.manager [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", 
line 1423, in copy_virtual_disk [ 1079.639285] env[62277]: ERROR nova.compute.manager [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] session._wait_for_task(vmdk_copy_task) [ 1079.639285] env[62277]: ERROR nova.compute.manager [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1079.639285] env[62277]: ERROR nova.compute.manager [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] return self.wait_for_task(task_ref) [ 1079.639285] env[62277]: ERROR nova.compute.manager [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1079.639285] env[62277]: ERROR nova.compute.manager [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] return evt.wait() [ 1079.639285] env[62277]: ERROR nova.compute.manager [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1079.639285] env[62277]: ERROR nova.compute.manager [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] result = hub.switch() [ 1079.639285] env[62277]: ERROR nova.compute.manager [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1079.639285] env[62277]: ERROR nova.compute.manager [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] return self.greenlet.switch() [ 1079.639562] env[62277]: ERROR nova.compute.manager [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1079.639562] env[62277]: ERROR nova.compute.manager [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] self.f(*self.args, **self.kw) [ 1079.639562] env[62277]: ERROR nova.compute.manager [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1079.639562] env[62277]: ERROR nova.compute.manager [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] raise exceptions.translate_fault(task_info.error) [ 1079.639562] env[62277]: ERROR nova.compute.manager [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1079.639562] env[62277]: ERROR nova.compute.manager [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] Faults: ['InvalidArgument'] [ 1079.639562] env[62277]: ERROR nova.compute.manager [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] [ 1079.639562] env[62277]: INFO nova.compute.manager [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] Terminating instance [ 1079.640940] env[62277]: DEBUG oslo_concurrency.lockutils [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1079.641147] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Creating 
directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1079.641820] env[62277]: DEBUG nova.compute.manager [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] Start destroying the instance on the hypervisor. {{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1079.642011] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1079.642240] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-cbaaefe5-2597-4ea4-95ee-36b217a50214 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1079.644948] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ff462dbc-ce13-422e-a7b7-e8bb629e6bb6 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1079.651986] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] Unregistering the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1079.652218] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-3ac5b012-a48f-4f1b-8b27-6c15cea23e8d {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1079.654470] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1079.654638] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62277) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1079.655562] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9e77248d-efa8-4030-842c-35e55aee9a84 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1079.660313] env[62277]: DEBUG oslo_vmware.api [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Waiting for the task: (returnval){ [ 1079.660313] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52643c32-4c77-a090-2c15-fe7d48acf81f" [ 1079.660313] env[62277]: _type = "Task" [ 1079.660313] env[62277]: } to complete. 
{{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1079.676957] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] Preparing fetch location {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1079.677382] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Creating directory with path [datastore2] vmware_temp/5afe04d8-0f27-4994-b468-43620ffd1bb0/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1079.677754] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c14fd12d-038a-455c-b8e7-2882b4ef8c87 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1079.690433] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Created directory with path [datastore2] vmware_temp/5afe04d8-0f27-4994-b468-43620ffd1bb0/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1079.690607] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] Fetch image to [datastore2] vmware_temp/5afe04d8-0f27-4994-b468-43620ffd1bb0/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1079.690775] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to [datastore2] vmware_temp/5afe04d8-0f27-4994-b468-43620ffd1bb0/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1079.691615] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-228f273c-555c-485e-be68-44cab5849d20 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1079.698934] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33bda49d-a4e6-40cf-a2b8-40abe3abea9e {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1079.708364] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-064b62af-4128-417a-af76-1397c9e95c09 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1079.745094] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-5c1eeef5-1d44-454d-b404-3fc29306650f {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1079.747836] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] Unregistered the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1079.748057] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] Deleting contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1079.748296] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Deleting the datastore file [datastore2] 1829c328-2a68-4297-b80e-afb0e898ba72 {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1079.748535] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-e725bbb5-88fe-4d69-bd8d-b3f1f7968be2 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1079.753676] env[62277]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-b9b178cb-b8bb-4d0c-aa5e-c2ba6f8f1f5d {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1079.756738] env[62277]: DEBUG oslo_vmware.api [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Waiting for the task: (returnval){ [ 1079.756738] env[62277]: value = "task-1405336" [ 1079.756738] env[62277]: _type = "Task" [ 1079.756738] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1079.764813] env[62277]: DEBUG oslo_vmware.api [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Task: {'id': task-1405336, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1079.776316] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1079.832218] env[62277]: DEBUG oslo_vmware.rw_handles [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/5afe04d8-0f27-4994-b468-43620ffd1bb0/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=62277) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1079.890125] env[62277]: DEBUG oslo_vmware.rw_handles [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Completed reading data from the image iterator. {{(pid=62277) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1079.890335] env[62277]: DEBUG oslo_vmware.rw_handles [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/5afe04d8-0f27-4994-b468-43620ffd1bb0/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62277) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1080.267784] env[62277]: DEBUG oslo_vmware.api [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Task: {'id': task-1405336, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.072701} completed successfully. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1080.268107] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Deleted the datastore file {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1080.268417] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] Deleted contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1080.268417] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1080.268544] env[62277]: INFO nova.compute.manager [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] Took 0.63 seconds to destroy the instance on the hypervisor. 
[ 1080.272062] env[62277]: DEBUG nova.compute.claims [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] Aborting claim: {{(pid=62277) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1080.274878] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1080.274878] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1080.638213] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-15b5615f-b37e-4c90-bd5c-513a85e62d47 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1080.647701] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b1b71e39-b20c-4ea4-881e-0f35aaa76126 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1080.676237] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f76ff73d-a5e3-404a-8feb-7a9ea57c8b04 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1080.683594] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82f8f30e-7c23-4293-ab0c-1bb823f48578 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1080.697061] env[62277]: DEBUG nova.compute.provider_tree [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1080.705502] env[62277]: DEBUG nova.scheduler.client.report [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1080.718757] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 
tempest-MigrationsAdminTest-1415859420-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.446s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1080.719267] env[62277]: ERROR nova.compute.manager [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1080.719267] env[62277]: Faults: ['InvalidArgument'] [ 1080.719267] env[62277]: ERROR nova.compute.manager [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] Traceback (most recent call last): [ 1080.719267] env[62277]: ERROR nova.compute.manager [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1080.719267] env[62277]: ERROR nova.compute.manager [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] self.driver.spawn(context, instance, image_meta, [ 1080.719267] env[62277]: ERROR nova.compute.manager [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1080.719267] env[62277]: ERROR nova.compute.manager [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1080.719267] env[62277]: ERROR nova.compute.manager [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1080.719267] env[62277]: ERROR nova.compute.manager [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] self._fetch_image_if_missing(context, vi) [ 1080.719267] env[62277]: ERROR nova.compute.manager [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1080.719267] env[62277]: ERROR nova.compute.manager [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] image_cache(vi, tmp_image_ds_loc) [ 1080.719267] env[62277]: ERROR nova.compute.manager [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1080.719635] env[62277]: ERROR nova.compute.manager [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] vm_util.copy_virtual_disk( [ 1080.719635] env[62277]: ERROR nova.compute.manager [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1080.719635] env[62277]: ERROR nova.compute.manager [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] session._wait_for_task(vmdk_copy_task) [ 1080.719635] env[62277]: ERROR nova.compute.manager [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1080.719635] env[62277]: ERROR nova.compute.manager [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] return self.wait_for_task(task_ref) [ 1080.719635] env[62277]: ERROR nova.compute.manager [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1080.719635] env[62277]: ERROR nova.compute.manager [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] return evt.wait() [ 1080.719635] env[62277]: ERROR nova.compute.manager [instance: 
1829c328-2a68-4297-b80e-afb0e898ba72] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1080.719635] env[62277]: ERROR nova.compute.manager [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] result = hub.switch() [ 1080.719635] env[62277]: ERROR nova.compute.manager [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1080.719635] env[62277]: ERROR nova.compute.manager [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] return self.greenlet.switch() [ 1080.719635] env[62277]: ERROR nova.compute.manager [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1080.719635] env[62277]: ERROR nova.compute.manager [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] self.f(*self.args, **self.kw) [ 1080.720040] env[62277]: ERROR nova.compute.manager [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1080.720040] env[62277]: ERROR nova.compute.manager [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] raise exceptions.translate_fault(task_info.error) [ 1080.720040] env[62277]: ERROR nova.compute.manager [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1080.720040] env[62277]: ERROR nova.compute.manager [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] Faults: ['InvalidArgument'] [ 1080.720040] env[62277]: ERROR nova.compute.manager [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] [ 1080.720040] env[62277]: DEBUG nova.compute.utils [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] VimFaultException {{(pid=62277) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1080.721386] env[62277]: DEBUG nova.compute.manager [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] Build of instance 1829c328-2a68-4297-b80e-afb0e898ba72 was re-scheduled: A specified parameter was not correct: fileType [ 1080.721386] env[62277]: Faults: ['InvalidArgument'] {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1080.721819] env[62277]: DEBUG nova.compute.manager [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] Unplugging VIFs for instance {{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1080.722008] env[62277]: DEBUG nova.compute.manager [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1080.722202] env[62277]: DEBUG nova.compute.manager [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1080.722330] env[62277]: DEBUG nova.network.neutron [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1081.034943] env[62277]: DEBUG nova.network.neutron [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1081.048420] env[62277]: INFO nova.compute.manager [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] [instance: 1829c328-2a68-4297-b80e-afb0e898ba72] Took 0.33 seconds to deallocate network for instance. [ 1081.139898] env[62277]: INFO nova.scheduler.client.report [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Deleted allocations for instance 1829c328-2a68-4297-b80e-afb0e898ba72 [ 1081.157891] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b2d8d492-83bc-47a7-a085-54075639cec7 tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Lock "1829c328-2a68-4297-b80e-afb0e898ba72" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 101.425s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1081.172638] env[62277]: DEBUG nova.compute.manager [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Starting instance... 
{{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1081.233477] env[62277]: DEBUG oslo_concurrency.lockutils [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1081.233477] env[62277]: DEBUG oslo_concurrency.lockutils [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1081.234928] env[62277]: INFO nova.compute.claims [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1081.643793] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d5caf54b-3b05-497b-81de-44b82838eee7 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1081.651899] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0fbd480d-df1e-4707-9e67-735f1c94eec0 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1081.682014] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ebae365-fbf0-42df-8cd3-f1ac1f16a940 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1081.689421] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eee8cf11-f584-46a5-a3c9-f3c8e4c9e864 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1081.702312] env[62277]: DEBUG nova.compute.provider_tree [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1081.713473] env[62277]: DEBUG nova.scheduler.client.report [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1081.726910] 
env[62277]: DEBUG oslo_concurrency.lockutils [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.494s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1081.727546] env[62277]: DEBUG nova.compute.manager [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Start building networks asynchronously for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1081.760149] env[62277]: DEBUG nova.compute.utils [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Using /dev/sd instead of None {{(pid=62277) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1081.764288] env[62277]: DEBUG nova.compute.manager [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Allocating IP information in the background. {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1081.764288] env[62277]: DEBUG nova.network.neutron [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] allocate_for_instance() {{(pid=62277) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1081.771673] env[62277]: DEBUG nova.compute.manager [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Start building block device mappings for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1081.836688] env[62277]: DEBUG nova.policy [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9eae0bcac16344fb8d4ef6d04a8046e5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '16b0f040e4024e61afb1ba610c202eb1', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62277) authorize /opt/stack/nova/nova/policy.py:203}} [ 1081.839557] env[62277]: DEBUG nova.compute.manager [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Start spawning the instance on the hypervisor. 
{{(pid=62277) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1081.869265] env[62277]: DEBUG nova.virt.hardware [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-11-06T22:11:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-11-06T22:11:15Z,direct_url=,disk_format='vmdk',id=6f125163-af69-40e9-92ae-3b8a01d74b60,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='63ed6630e9c140baa826f53d7a0564d1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-11-06T22:11:16Z,virtual_size=,visibility=), allow threads: False {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1081.869492] env[62277]: DEBUG nova.virt.hardware [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Flavor limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1081.869642] env[62277]: DEBUG nova.virt.hardware [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Image limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1081.869862] env[62277]: DEBUG nova.virt.hardware [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Flavor pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1081.870019] env[62277]: DEBUG nova.virt.hardware [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Image pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1081.870160] env[62277]: DEBUG nova.virt.hardware [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1081.870357] env[62277]: DEBUG nova.virt.hardware [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1081.870504] env[62277]: DEBUG nova.virt.hardware [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62277) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 1081.870657] env[62277]: DEBUG nova.virt.hardware [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Got 1 possible topologies {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1081.870807] env[62277]: DEBUG nova.virt.hardware [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1081.870967] env[62277]: DEBUG nova.virt.hardware [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1081.871858] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca843590-872f-45e0-a34b-4cf952cfc9b2 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1081.882344] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e931962-82aa-4b39-b451-94653b927724 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1082.193071] env[62277]: DEBUG nova.network.neutron [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Successfully created port: d27d678f-ba40-47aa-9a52-70d9241dd04e {{(pid=62277) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1082.994307] env[62277]: DEBUG nova.network.neutron [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Successfully updated port: d27d678f-ba40-47aa-9a52-70d9241dd04e {{(pid=62277) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1083.006321] env[62277]: DEBUG oslo_concurrency.lockutils [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Acquiring lock "refresh_cache-23bc5a48-3e96-4897-bf28-ad14a0bdde62" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1083.006473] env[62277]: DEBUG oslo_concurrency.lockutils [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Acquired lock "refresh_cache-23bc5a48-3e96-4897-bf28-ad14a0bdde62" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1083.006621] env[62277]: DEBUG nova.network.neutron [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Building network info cache for instance {{(pid=62277) 
_get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1083.072309] env[62277]: DEBUG nova.network.neutron [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Instance cache missing network info. {{(pid=62277) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1083.205507] env[62277]: DEBUG oslo_concurrency.lockutils [None req-8ca2c13e-fb65-4eb0-8646-8fef9a5fc83f tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Acquiring lock "c000e183-2e57-470e-a9a5-30b5899e77c1" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1083.205743] env[62277]: DEBUG oslo_concurrency.lockutils [None req-8ca2c13e-fb65-4eb0-8646-8fef9a5fc83f tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Lock "c000e183-2e57-470e-a9a5-30b5899e77c1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1083.337160] env[62277]: DEBUG nova.compute.manager [req-a7e31e90-fb31-4786-8a65-11c1ab925868 req-11e3d1e6-81aa-4fba-9dda-fea555568c62 service nova] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Received event network-vif-plugged-d27d678f-ba40-47aa-9a52-70d9241dd04e {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 1083.337385] env[62277]: DEBUG oslo_concurrency.lockutils [req-a7e31e90-fb31-4786-8a65-11c1ab925868 req-11e3d1e6-81aa-4fba-9dda-fea555568c62 service nova] Acquiring lock "23bc5a48-3e96-4897-bf28-ad14a0bdde62-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1083.337601] env[62277]: DEBUG oslo_concurrency.lockutils [req-a7e31e90-fb31-4786-8a65-11c1ab925868 req-11e3d1e6-81aa-4fba-9dda-fea555568c62 service nova] Lock "23bc5a48-3e96-4897-bf28-ad14a0bdde62-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1083.337771] env[62277]: DEBUG oslo_concurrency.lockutils [req-a7e31e90-fb31-4786-8a65-11c1ab925868 req-11e3d1e6-81aa-4fba-9dda-fea555568c62 service nova] Lock "23bc5a48-3e96-4897-bf28-ad14a0bdde62-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1083.337936] env[62277]: DEBUG nova.compute.manager [req-a7e31e90-fb31-4786-8a65-11c1ab925868 req-11e3d1e6-81aa-4fba-9dda-fea555568c62 service nova] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] No waiting events found dispatching network-vif-plugged-d27d678f-ba40-47aa-9a52-70d9241dd04e {{(pid=62277) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1083.338113] env[62277]: WARNING nova.compute.manager [req-a7e31e90-fb31-4786-8a65-11c1ab925868 req-11e3d1e6-81aa-4fba-9dda-fea555568c62 service nova] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Received 
unexpected event network-vif-plugged-d27d678f-ba40-47aa-9a52-70d9241dd04e for instance with vm_state building and task_state spawning. [ 1083.338275] env[62277]: DEBUG nova.compute.manager [req-a7e31e90-fb31-4786-8a65-11c1ab925868 req-11e3d1e6-81aa-4fba-9dda-fea555568c62 service nova] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Received event network-changed-d27d678f-ba40-47aa-9a52-70d9241dd04e {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 1083.338424] env[62277]: DEBUG nova.compute.manager [req-a7e31e90-fb31-4786-8a65-11c1ab925868 req-11e3d1e6-81aa-4fba-9dda-fea555568c62 service nova] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Refreshing instance network info cache due to event network-changed-d27d678f-ba40-47aa-9a52-70d9241dd04e. {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11104}} [ 1083.338586] env[62277]: DEBUG oslo_concurrency.lockutils [req-a7e31e90-fb31-4786-8a65-11c1ab925868 req-11e3d1e6-81aa-4fba-9dda-fea555568c62 service nova] Acquiring lock "refresh_cache-23bc5a48-3e96-4897-bf28-ad14a0bdde62" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1083.486556] env[62277]: DEBUG nova.network.neutron [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Updating instance_info_cache with network_info: [{"id": "d27d678f-ba40-47aa-9a52-70d9241dd04e", "address": "fa:16:3e:fe:ab:7f", "network": {"id": "ba421bb8-29ee-4e25-a479-0fd2fc1d19ad", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-154938157-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "16b0f040e4024e61afb1ba610c202eb1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d298db54-f13d-4bf6-b6c2-755074b3047f", "external-id": "nsx-vlan-transportzone-631", "segmentation_id": 631, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd27d678f-ba", "ovs_interfaceid": "d27d678f-ba40-47aa-9a52-70d9241dd04e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1083.497975] env[62277]: DEBUG oslo_concurrency.lockutils [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Releasing lock "refresh_cache-23bc5a48-3e96-4897-bf28-ad14a0bdde62" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1083.498301] env[62277]: DEBUG nova.compute.manager [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Instance network_info: |[{"id": "d27d678f-ba40-47aa-9a52-70d9241dd04e", "address": "fa:16:3e:fe:ab:7f", 
"network": {"id": "ba421bb8-29ee-4e25-a479-0fd2fc1d19ad", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-154938157-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "16b0f040e4024e61afb1ba610c202eb1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d298db54-f13d-4bf6-b6c2-755074b3047f", "external-id": "nsx-vlan-transportzone-631", "segmentation_id": 631, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd27d678f-ba", "ovs_interfaceid": "d27d678f-ba40-47aa-9a52-70d9241dd04e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1083.498599] env[62277]: DEBUG oslo_concurrency.lockutils [req-a7e31e90-fb31-4786-8a65-11c1ab925868 req-11e3d1e6-81aa-4fba-9dda-fea555568c62 service nova] Acquired lock "refresh_cache-23bc5a48-3e96-4897-bf28-ad14a0bdde62" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1083.499390] env[62277]: DEBUG nova.network.neutron [req-a7e31e90-fb31-4786-8a65-11c1ab925868 req-11e3d1e6-81aa-4fba-9dda-fea555568c62 service nova] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Refreshing network info cache for port d27d678f-ba40-47aa-9a52-70d9241dd04e {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1083.499820] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:fe:ab:7f', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'd298db54-f13d-4bf6-b6c2-755074b3047f', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'd27d678f-ba40-47aa-9a52-70d9241dd04e', 'vif_model': 'vmxnet3'}] {{(pid=62277) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1083.507921] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Creating folder: Project (16b0f040e4024e61afb1ba610c202eb1). Parent ref: group-v297781. {{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1083.510705] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-5b3577e0-75be-422c-bdcf-48a4a46ccd43 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1083.521274] env[62277]: INFO nova.virt.vmwareapi.vm_util [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Created folder: Project (16b0f040e4024e61afb1ba610c202eb1) in parent group-v297781. 
[ 1083.521460] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Creating folder: Instances. Parent ref: group-v297820. {{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1083.521689] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-ac324eb2-198e-4fc9-9a47-98f3e0a55b1f {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1083.529781] env[62277]: INFO nova.virt.vmwareapi.vm_util [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Created folder: Instances in parent group-v297820. [ 1083.529929] env[62277]: DEBUG oslo.service.loopingcall [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1083.530145] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Creating VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1083.530334] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-b051c4ce-ed5d-4811-9dfe-e5ede51b91c9 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1083.553030] env[62277]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1083.553030] env[62277]: value = "task-1405339" [ 1083.553030] env[62277]: _type = "Task" [ 1083.553030] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1083.560484] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405339, 'name': CreateVM_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1083.852773] env[62277]: DEBUG nova.network.neutron [req-a7e31e90-fb31-4786-8a65-11c1ab925868 req-11e3d1e6-81aa-4fba-9dda-fea555568c62 service nova] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Updated VIF entry in instance network info cache for port d27d678f-ba40-47aa-9a52-70d9241dd04e. 
{{(pid=62277) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1083.853159] env[62277]: DEBUG nova.network.neutron [req-a7e31e90-fb31-4786-8a65-11c1ab925868 req-11e3d1e6-81aa-4fba-9dda-fea555568c62 service nova] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Updating instance_info_cache with network_info: [{"id": "d27d678f-ba40-47aa-9a52-70d9241dd04e", "address": "fa:16:3e:fe:ab:7f", "network": {"id": "ba421bb8-29ee-4e25-a479-0fd2fc1d19ad", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-154938157-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "16b0f040e4024e61afb1ba610c202eb1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d298db54-f13d-4bf6-b6c2-755074b3047f", "external-id": "nsx-vlan-transportzone-631", "segmentation_id": 631, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd27d678f-ba", "ovs_interfaceid": "d27d678f-ba40-47aa-9a52-70d9241dd04e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1083.862676] env[62277]: DEBUG oslo_concurrency.lockutils [req-a7e31e90-fb31-4786-8a65-11c1ab925868 req-11e3d1e6-81aa-4fba-9dda-fea555568c62 service nova] Releasing lock "refresh_cache-23bc5a48-3e96-4897-bf28-ad14a0bdde62" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1084.064376] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405339, 'name': CreateVM_Task, 'duration_secs': 0.310669} completed successfully. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1084.064376] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Created VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1084.064376] env[62277]: DEBUG oslo_concurrency.lockutils [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1084.064376] env[62277]: DEBUG oslo_concurrency.lockutils [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1084.064767] env[62277]: DEBUG oslo_concurrency.lockutils [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1084.064803] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-34d8a5fa-b1e3-4664-b55e-eccf502ed90d {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1084.069250] env[62277]: DEBUG oslo_vmware.api [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Waiting for the task: (returnval){ [ 1084.069250] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52925abe-2abb-eb94-4e6d-4636153dcdf4" [ 1084.069250] env[62277]: _type = "Task" [ 1084.069250] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1084.081783] env[62277]: DEBUG oslo_vmware.api [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52925abe-2abb-eb94-4e6d-4636153dcdf4, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1084.580029] env[62277]: DEBUG oslo_concurrency.lockutils [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1084.580029] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Processing image 6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1084.580200] env[62277]: DEBUG oslo_concurrency.lockutils [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1093.168645] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1093.168873] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Starting heal instance info cache {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9909}} [ 1093.168966] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Rebuilding the list of instances to heal {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9913}} [ 1093.189903] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1093.190053] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1093.190205] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1093.190322] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] Skipping network cache update for instance because it is Building. 
{{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1093.190446] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1093.190566] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 68925f1b-da69-4955-acb1-d6500b03daee] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1093.190685] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1093.190806] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1093.190944] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1093.191043] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1093.191167] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Didn't find any instances for network info cache update. 
{{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9995}} [ 1094.168597] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1095.168343] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1096.168922] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1097.164398] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1097.169231] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1097.169231] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager.update_available_resource {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1097.179501] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1097.179714] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1097.179991] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1097.181048] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62277) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1097.182352] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16fef4f0-095c-4726-8795-e93663f5e641 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1097.190846] env[62277]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4d761c4d-dec0-4b4b-80c5-453616ca81f1 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1097.204772] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-775d382c-0819-4912-a7e4-674db4aa6438 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1097.211038] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-013b1de3-acf6-43e2-9dfd-e85245769d4f {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1097.239479] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181420MB free_disk=184GB free_vcpus=48 pci_devices=None {{(pid=62277) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1097.239637] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1097.239831] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1097.316976] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 3d260cd8-ab21-4e1e-8891-6f216350a587 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1097.317166] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance d68ccb50-a04d-4e59-8161-f01305eb81a8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1097.317294] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 19d15611-315f-4c4b-8f32-e5d00d0d8ca8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1097.317414] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance dfc291fd-1481-4e76-9fb3-ec87124c1281 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1097.317534] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 36ff1435-1999-4e95-8920-81a1b25cc452 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1097.317654] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 68925f1b-da69-4955-acb1-d6500b03daee actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1097.317767] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 43255fee-e5da-4fe0-8fa7-4aba7592745b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1097.317878] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 866c4415-caab-4d81-86ba-ed662feb3c4f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1097.317990] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 350e2302-66b9-4dd6-b0f4-77000992408b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1097.318121] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 23bc5a48-3e96-4897-bf28-ad14a0bdde62 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1097.329567] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 346748bd-b4e8-4e93-b71d-66c90a45e372 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1097.342917] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 930ff058-ab48-4c8a-8f5e-4820a3b12d50 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1097.354242] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 154fa64c-55d4-4b72-8af9-39e72fd5df5f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1097.363994] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1097.373526] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 79177b7a-1bf6-4649-80c7-4ba2c6cda0ad has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1097.384814] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 4cc3443d-3d05-4d3b-a222-6b7367c1c989 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1097.395371] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 04122dba-4cbf-4176-95bd-f1c4bfaa799e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1097.408512] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 30d7d279-7241-4f2e-b963-d205a5f9fa41 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1097.419077] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance fb905507-549f-4265-9851-4a42930c02a4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1097.431198] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 4f5018a5-26f5-4a31-8a9d-d2557c906995 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1097.440648] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance c01ad807-a9a5-4028-baf4-0469a6301459 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1097.450539] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 1e8429b2-7149-4832-8590-e0ebd8501176 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1097.460428] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance f646a534-1ae8-40dd-9819-3d71bda87ae2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1097.470195] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 2b98866e-3c86-47bd-9eff-2c2743631563 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1097.480184] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 560b7750-03fe-4a4c-ab1d-a1751895986b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1097.489671] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 86e8e8ba-e476-400d-b180-bb7df8a042d8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1097.498534] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance c3e72352-f795-4ce7-9e0b-4e80c4329f7b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1097.507763] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance b6908c32-5916-4a0e-92e2-21f480c5f7ca has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1097.519099] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 27cd13ca-a17c-476e-a00a-cca1fe898763 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1097.528352] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance c000e183-2e57-470e-a9a5-30b5899e77c1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1097.528579] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1097.528723] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1097.855515] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-47376bda-1604-4bd3-9200-996b8a2e12f7 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1097.862853] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5b0cc194-f9a1-4bb6-8adb-671aeee838ab {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1097.891701] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-439bf32d-a788-4fe9-9a2b-817273084804 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1097.898699] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-854dedf4-7f53-47ee-a5fd-72af0f03950c {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1097.911153] env[62277]: DEBUG nova.compute.provider_tree [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1097.919561] env[62277]: DEBUG nova.scheduler.client.report [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1097.932624] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62277) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1097.932794] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.693s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1098.928599] env[62277]: DEBUG oslo_service.periodic_task [None 
req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1098.950274] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1103.170435] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1103.170435] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=62277) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10528}} [ 1106.940518] env[62277]: WARNING oslo_vmware.rw_handles [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1106.940518] env[62277]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1106.940518] env[62277]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1106.940518] env[62277]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1106.940518] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1106.940518] env[62277]: ERROR oslo_vmware.rw_handles response.begin() [ 1106.940518] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1106.940518] env[62277]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1106.940518] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1106.940518] env[62277]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1106.940518] env[62277]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1106.940518] env[62277]: ERROR oslo_vmware.rw_handles [ 1106.941172] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] Downloaded image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to vmware_temp/b7191af3-61e4-4c8b-b087-6ce0f3f50e47/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore1 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1106.942577] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] Caching image {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1106.942692] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None 
req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Copying Virtual Disk [datastore1] vmware_temp/b7191af3-61e4-4c8b-b087-6ce0f3f50e47/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk to [datastore1] vmware_temp/b7191af3-61e4-4c8b-b087-6ce0f3f50e47/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk {{(pid=62277) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1106.943018] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-9f64f87a-73c9-4d9a-94d9-7ba8b9d2ede2 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1106.950805] env[62277]: DEBUG oslo_vmware.api [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Waiting for the task: (returnval){ [ 1106.950805] env[62277]: value = "task-1405340" [ 1106.950805] env[62277]: _type = "Task" [ 1106.950805] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1106.958654] env[62277]: DEBUG oslo_vmware.api [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Task: {'id': task-1405340, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1107.460772] env[62277]: DEBUG oslo_vmware.exceptions [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Fault InvalidArgument not matched. 
{{(pid=62277) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1107.461076] env[62277]: DEBUG oslo_concurrency.lockutils [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1107.461620] env[62277]: ERROR nova.compute.manager [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1107.461620] env[62277]: Faults: ['InvalidArgument'] [ 1107.461620] env[62277]: ERROR nova.compute.manager [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] Traceback (most recent call last): [ 1107.461620] env[62277]: ERROR nova.compute.manager [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1107.461620] env[62277]: ERROR nova.compute.manager [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] yield resources [ 1107.461620] env[62277]: ERROR nova.compute.manager [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1107.461620] env[62277]: ERROR nova.compute.manager [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] self.driver.spawn(context, instance, image_meta, [ 1107.461620] env[62277]: ERROR nova.compute.manager [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1107.461620] env[62277]: ERROR nova.compute.manager [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1107.461620] env[62277]: ERROR nova.compute.manager [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1107.461620] env[62277]: ERROR nova.compute.manager [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] self._fetch_image_if_missing(context, vi) [ 1107.461620] env[62277]: ERROR nova.compute.manager [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1107.461989] env[62277]: ERROR nova.compute.manager [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] image_cache(vi, tmp_image_ds_loc) [ 1107.461989] env[62277]: ERROR nova.compute.manager [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1107.461989] env[62277]: ERROR nova.compute.manager [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] vm_util.copy_virtual_disk( [ 1107.461989] env[62277]: ERROR nova.compute.manager [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1107.461989] env[62277]: ERROR nova.compute.manager [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] session._wait_for_task(vmdk_copy_task) [ 1107.461989] env[62277]: ERROR nova.compute.manager [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", 
line 157, in _wait_for_task [ 1107.461989] env[62277]: ERROR nova.compute.manager [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] return self.wait_for_task(task_ref) [ 1107.461989] env[62277]: ERROR nova.compute.manager [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1107.461989] env[62277]: ERROR nova.compute.manager [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] return evt.wait() [ 1107.461989] env[62277]: ERROR nova.compute.manager [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1107.461989] env[62277]: ERROR nova.compute.manager [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] result = hub.switch() [ 1107.461989] env[62277]: ERROR nova.compute.manager [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1107.461989] env[62277]: ERROR nova.compute.manager [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] return self.greenlet.switch() [ 1107.462381] env[62277]: ERROR nova.compute.manager [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1107.462381] env[62277]: ERROR nova.compute.manager [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] self.f(*self.args, **self.kw) [ 1107.462381] env[62277]: ERROR nova.compute.manager [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1107.462381] env[62277]: ERROR nova.compute.manager [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] raise exceptions.translate_fault(task_info.error) [ 1107.462381] env[62277]: ERROR nova.compute.manager [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1107.462381] env[62277]: ERROR nova.compute.manager [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] Faults: ['InvalidArgument'] [ 1107.462381] env[62277]: ERROR nova.compute.manager [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] [ 1107.462381] env[62277]: INFO nova.compute.manager [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] Terminating instance [ 1107.464418] env[62277]: DEBUG nova.compute.manager [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] Start destroying the instance on the hypervisor. 
{{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1107.464507] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1107.465244] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b6f7dc91-f85b-449d-b1ca-2962437e6c60 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1107.471888] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] Unregistering the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1107.472161] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-d5b6d270-00df-4564-8d87-8a859cd25acb {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1107.539310] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] Unregistered the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1107.539540] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] Deleting contents of the VM from datastore datastore1 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1107.539719] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Deleting the datastore file [datastore1] 43255fee-e5da-4fe0-8fa7-4aba7592745b {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1107.539997] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-bed2b796-6f2a-418c-9b47-4b161a3ce734 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1107.549014] env[62277]: DEBUG oslo_vmware.api [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Waiting for the task: (returnval){ [ 1107.549014] env[62277]: value = "task-1405342" [ 1107.549014] env[62277]: _type = "Task" [ 1107.549014] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1107.555021] env[62277]: DEBUG oslo_vmware.api [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Task: {'id': task-1405342, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1108.059995] env[62277]: DEBUG oslo_vmware.api [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Task: {'id': task-1405342, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.070545} completed successfully. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1108.060273] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Deleted the datastore file {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1108.060437] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] Deleted contents of the VM from datastore datastore1 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1108.060601] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1108.060769] env[62277]: INFO nova.compute.manager [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] Took 0.60 seconds to destroy the instance on the hypervisor. 
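The CopyVirtualDisk_Task and DeleteDatastoreFile_Task entries above all follow the same pattern: oslo.vmware returns a task moref immediately, and the caller polls the task's info property until it reaches a terminal state; the "Task: {...} progress is 0%" lines are those polls. Below is a minimal sketch of that loop, assuming an oslo.vmware VMwareAPISession; the real logic lives in oslo_vmware.api (wait_for_task / _poll_task) and uses a looping call rather than a plain while loop.

```python
import time

from oslo_vmware import exceptions as vexc
from oslo_vmware import vim_util


def poll_vmware_task(session, task_ref, interval=0.5):
    """Poll a vCenter task until it finishes, mirroring the _poll_task lines above."""
    while True:
        task_info = session.invoke_api(vim_util, 'get_object_property',
                                       session.vim, task_ref, 'info')
        if task_info.state == 'success':
            return task_info
        if task_info.state == 'error':
            # This is where a fault such as "A specified parameter was not
            # correct: fileType" surfaces as a VimFaultException.
            raise vexc.translate_fault(task_info.error)
        # Still 'queued' or 'running': wait and poll again ("progress is 0%").
        time.sleep(interval)
```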
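For the resource tracker entries near the top of this stretch, the reported totals are easy to sanity-check: the inventory reserves 512 MB of RAM, and each tracked m1.nano-sized allocation is 1 GB disk / 128 MB RAM / 1 vCPU, so ten of them plus the reservation line up with the "Final resource view" figures (used_ram=1792MB, used_disk=10GB, used_vcpus=10). A quick consistency check using the numbers from the log; this is plain arithmetic, not how Nova itself computes the view.

```python
# Consistency check for the "Final resource view" figures reported above.
allocations = [{'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}] * 10  # ten tracked instances
reserved_ram_mb = 512  # 'reserved' for MEMORY_MB in the reported inventory data

used_ram_mb = reserved_ram_mb + sum(a['MEMORY_MB'] for a in allocations)
used_disk_gb = sum(a['DISK_GB'] for a in allocations)
used_vcpus = sum(a['VCPU'] for a in allocations)

assert (used_ram_mb, used_disk_gb, used_vcpus) == (1792, 10, 10)
print(f'used_ram={used_ram_mb}MB used_disk={used_disk_gb}GB used_vcpus={used_vcpus}')
```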
[ 1108.063402] env[62277]: DEBUG nova.compute.claims [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] Aborting claim: {{(pid=62277) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1108.063578] env[62277]: DEBUG oslo_concurrency.lockutils [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1108.063786] env[62277]: DEBUG oslo_concurrency.lockutils [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1108.428010] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-390ff037-99aa-4f04-a5f3-16af700c5b4e {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1108.436710] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-63c10dee-1637-436c-bbce-16ab940992fd {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1108.465520] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33ced170-963e-43bf-a732-5b19bea4b2d9 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1108.472383] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dba3248e-fc33-449b-bf3a-1bb68be0e9fe {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1108.485097] env[62277]: DEBUG nova.compute.provider_tree [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1108.493665] env[62277]: DEBUG nova.scheduler.client.report [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1108.511198] env[62277]: DEBUG oslo_concurrency.lockutils [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 
tempest-ServersAdminTestJSON-1397695677-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.447s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1108.511301] env[62277]: ERROR nova.compute.manager [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1108.511301] env[62277]: Faults: ['InvalidArgument'] [ 1108.511301] env[62277]: ERROR nova.compute.manager [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] Traceback (most recent call last): [ 1108.511301] env[62277]: ERROR nova.compute.manager [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1108.511301] env[62277]: ERROR nova.compute.manager [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] self.driver.spawn(context, instance, image_meta, [ 1108.511301] env[62277]: ERROR nova.compute.manager [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1108.511301] env[62277]: ERROR nova.compute.manager [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1108.511301] env[62277]: ERROR nova.compute.manager [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1108.511301] env[62277]: ERROR nova.compute.manager [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] self._fetch_image_if_missing(context, vi) [ 1108.511301] env[62277]: ERROR nova.compute.manager [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1108.511301] env[62277]: ERROR nova.compute.manager [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] image_cache(vi, tmp_image_ds_loc) [ 1108.511301] env[62277]: ERROR nova.compute.manager [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1108.511588] env[62277]: ERROR nova.compute.manager [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] vm_util.copy_virtual_disk( [ 1108.511588] env[62277]: ERROR nova.compute.manager [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1108.511588] env[62277]: ERROR nova.compute.manager [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] session._wait_for_task(vmdk_copy_task) [ 1108.511588] env[62277]: ERROR nova.compute.manager [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1108.511588] env[62277]: ERROR nova.compute.manager [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] return self.wait_for_task(task_ref) [ 1108.511588] env[62277]: ERROR nova.compute.manager [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1108.511588] env[62277]: ERROR nova.compute.manager [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] return evt.wait() [ 1108.511588] env[62277]: ERROR nova.compute.manager [instance: 
43255fee-e5da-4fe0-8fa7-4aba7592745b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1108.511588] env[62277]: ERROR nova.compute.manager [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] result = hub.switch() [ 1108.511588] env[62277]: ERROR nova.compute.manager [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1108.511588] env[62277]: ERROR nova.compute.manager [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] return self.greenlet.switch() [ 1108.511588] env[62277]: ERROR nova.compute.manager [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1108.511588] env[62277]: ERROR nova.compute.manager [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] self.f(*self.args, **self.kw) [ 1108.511869] env[62277]: ERROR nova.compute.manager [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1108.511869] env[62277]: ERROR nova.compute.manager [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] raise exceptions.translate_fault(task_info.error) [ 1108.511869] env[62277]: ERROR nova.compute.manager [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1108.511869] env[62277]: ERROR nova.compute.manager [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] Faults: ['InvalidArgument'] [ 1108.511869] env[62277]: ERROR nova.compute.manager [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] [ 1108.511986] env[62277]: DEBUG nova.compute.utils [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] VimFaultException {{(pid=62277) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1108.513445] env[62277]: DEBUG nova.compute.manager [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] Build of instance 43255fee-e5da-4fe0-8fa7-4aba7592745b was re-scheduled: A specified parameter was not correct: fileType [ 1108.513445] env[62277]: Faults: ['InvalidArgument'] {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1108.513800] env[62277]: DEBUG nova.compute.manager [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] Unplugging VIFs for instance {{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1108.513970] env[62277]: DEBUG nova.compute.manager [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1108.514147] env[62277]: DEBUG nova.compute.manager [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1108.514305] env[62277]: DEBUG nova.network.neutron [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1108.813712] env[62277]: DEBUG nova.network.neutron [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1108.826869] env[62277]: INFO nova.compute.manager [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 43255fee-e5da-4fe0-8fa7-4aba7592745b] Took 0.31 seconds to deallocate network for instance. [ 1108.932127] env[62277]: INFO nova.scheduler.client.report [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Deleted allocations for instance 43255fee-e5da-4fe0-8fa7-4aba7592745b [ 1108.957390] env[62277]: DEBUG oslo_concurrency.lockutils [None req-37041550-d623-4f32-9d28-8037a940ce91 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Lock "43255fee-e5da-4fe0-8fa7-4aba7592745b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 103.141s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1108.968731] env[62277]: DEBUG nova.compute.manager [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Starting instance... 
{{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1109.019683] env[62277]: DEBUG oslo_concurrency.lockutils [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1109.019683] env[62277]: DEBUG oslo_concurrency.lockutils [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1109.020906] env[62277]: INFO nova.compute.claims [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1109.405435] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bcf71fe0-8f71-4b7a-ba66-6bcf421aa223 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1109.413453] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b63653f2-4df4-4703-bba4-7f0d95740808 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1109.442782] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-03f0e68c-6a31-43bb-9fc8-4a0569b05290 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1109.449887] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-18ef5bad-e5c1-40fe-9c01-371a12e18c49 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1109.462876] env[62277]: DEBUG nova.compute.provider_tree [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1109.472110] env[62277]: DEBUG nova.scheduler.client.report [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1109.484381] env[62277]: DEBUG oslo_concurrency.lockutils [None 
req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.465s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1109.484847] env[62277]: DEBUG nova.compute.manager [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Start building networks asynchronously for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1109.519028] env[62277]: DEBUG nova.compute.utils [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Using /dev/sd instead of None {{(pid=62277) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1109.520686] env[62277]: DEBUG nova.compute.manager [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Allocating IP information in the background. {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1109.520864] env[62277]: DEBUG nova.network.neutron [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] allocate_for_instance() {{(pid=62277) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1109.532806] env[62277]: DEBUG nova.compute.manager [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Start building block device mappings for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1109.596388] env[62277]: DEBUG nova.compute.manager [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Start spawning the instance on the hypervisor. 
{{(pid=62277) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1109.622683] env[62277]: DEBUG nova.policy [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ddb10ec35cda47c4850c8c72dd419f7b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '26e9cf6618c04f39a1644982221981ee', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62277) authorize /opt/stack/nova/nova/policy.py:203}} [ 1109.626386] env[62277]: DEBUG nova.virt.hardware [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-11-06T22:11:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-11-06T22:11:15Z,direct_url=,disk_format='vmdk',id=6f125163-af69-40e9-92ae-3b8a01d74b60,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='63ed6630e9c140baa826f53d7a0564d1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-11-06T22:11:16Z,virtual_size=,visibility=), allow threads: False {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1109.626616] env[62277]: DEBUG nova.virt.hardware [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Flavor limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1109.626769] env[62277]: DEBUG nova.virt.hardware [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Image limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1109.626948] env[62277]: DEBUG nova.virt.hardware [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Flavor pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1109.627104] env[62277]: DEBUG nova.virt.hardware [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Image pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1109.627248] env[62277]: DEBUG nova.virt.hardware [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1109.627444] env[62277]: DEBUG nova.virt.hardware [None 
req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1109.627597] env[62277]: DEBUG nova.virt.hardware [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1109.627755] env[62277]: DEBUG nova.virt.hardware [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Got 1 possible topologies {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1109.627910] env[62277]: DEBUG nova.virt.hardware [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1109.628088] env[62277]: DEBUG nova.virt.hardware [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1109.628931] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e6a5e55-c5a7-419f-87c5-52a902b6b852 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1109.637041] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c573ead9-a470-48e9-8483-a99012dea89c {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1109.677956] env[62277]: DEBUG oslo_concurrency.lockutils [None req-22fb2479-820d-46db-8404-196b9598fe72 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Acquiring lock "36ff1435-1999-4e95-8920-81a1b25cc452" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1110.100558] env[62277]: DEBUG nova.network.neutron [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Successfully created port: 028ff50a-13c6-4a41-99f1-93a1d2fa2fa4 {{(pid=62277) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1110.853156] env[62277]: DEBUG nova.network.neutron [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Successfully updated port: 028ff50a-13c6-4a41-99f1-93a1d2fa2fa4 {{(pid=62277) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1110.874798] env[62277]: DEBUG nova.compute.manager [req-d008b597-178a-469b-96ff-71aec357d771 
req-509ff628-be70-4569-953b-7f26120b374f service nova] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Received event network-vif-plugged-028ff50a-13c6-4a41-99f1-93a1d2fa2fa4 {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 1110.875039] env[62277]: DEBUG oslo_concurrency.lockutils [req-d008b597-178a-469b-96ff-71aec357d771 req-509ff628-be70-4569-953b-7f26120b374f service nova] Acquiring lock "346748bd-b4e8-4e93-b71d-66c90a45e372-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1110.875232] env[62277]: DEBUG oslo_concurrency.lockutils [req-d008b597-178a-469b-96ff-71aec357d771 req-509ff628-be70-4569-953b-7f26120b374f service nova] Lock "346748bd-b4e8-4e93-b71d-66c90a45e372-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1110.875399] env[62277]: DEBUG oslo_concurrency.lockutils [req-d008b597-178a-469b-96ff-71aec357d771 req-509ff628-be70-4569-953b-7f26120b374f service nova] Lock "346748bd-b4e8-4e93-b71d-66c90a45e372-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1110.875562] env[62277]: DEBUG nova.compute.manager [req-d008b597-178a-469b-96ff-71aec357d771 req-509ff628-be70-4569-953b-7f26120b374f service nova] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] No waiting events found dispatching network-vif-plugged-028ff50a-13c6-4a41-99f1-93a1d2fa2fa4 {{(pid=62277) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1110.875803] env[62277]: WARNING nova.compute.manager [req-d008b597-178a-469b-96ff-71aec357d771 req-509ff628-be70-4569-953b-7f26120b374f service nova] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Received unexpected event network-vif-plugged-028ff50a-13c6-4a41-99f1-93a1d2fa2fa4 for instance with vm_state building and task_state spawning. 
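The network-vif-plugged sequence just above shows Nova's external-event handshake: Neutron reports the port as plugged, the compute manager looks for a registered waiter for that event, finds none ("No waiting events found dispatching ..."), and logs the event as unexpected because the instance is still building. The sketch below is a simplified illustration of that registry pattern using plain threading primitives; the actual implementation is nova.compute.manager.InstanceEvents with eventlet-friendly events, so names and details here are assumptions.

```python
import threading
from collections import defaultdict


class InstanceEventRegistry:
    """Toy stand-in for the per-instance event bookkeeping seen in the log."""

    def __init__(self):
        self._lock = threading.Lock()
        self._events = defaultdict(dict)  # instance_uuid -> {event_name: Event}

    def prepare(self, instance_uuid, event_name):
        """Register interest in an event before starting the operation."""
        waiter = threading.Event()
        with self._lock:
            self._events[instance_uuid][event_name] = waiter
        return waiter

    def pop(self, instance_uuid, event_name):
        """Called when the external event arrives; returns the waiter, if any."""
        with self._lock:
            return self._events[instance_uuid].pop(event_name, None)


registry = InstanceEventRegistry()
waiter = registry.pop('346748bd-b4e8-4e93-b71d-66c90a45e372',
                      'network-vif-plugged-028ff50a-13c6-4a41-99f1-93a1d2fa2fa4')
if waiter is None:
    # Nothing was waiting, so the event is just logged (the WARNING above).
    print('Received unexpected event network-vif-plugged-... (no waiter registered)')
else:
    waiter.set()  # wake whatever thread is blocked waiting for the event
```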
[ 1110.876970] env[62277]: DEBUG oslo_concurrency.lockutils [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Acquiring lock "refresh_cache-346748bd-b4e8-4e93-b71d-66c90a45e372" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1110.877076] env[62277]: DEBUG oslo_concurrency.lockutils [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Acquired lock "refresh_cache-346748bd-b4e8-4e93-b71d-66c90a45e372" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1110.877224] env[62277]: DEBUG nova.network.neutron [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Building network info cache for instance {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1110.941838] env[62277]: DEBUG nova.network.neutron [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Instance cache missing network info. {{(pid=62277) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1111.214220] env[62277]: DEBUG nova.network.neutron [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Updating instance_info_cache with network_info: [{"id": "028ff50a-13c6-4a41-99f1-93a1d2fa2fa4", "address": "fa:16:3e:87:7a:1b", "network": {"id": "2c64e00f-4995-4827-b685-5096b1d1d064", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.188", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "63ed6630e9c140baa826f53d7a0564d1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "77aa121f-8fb6-42f3-aaea-43addfe449b2", "external-id": "nsx-vlan-transportzone-288", "segmentation_id": 288, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap028ff50a-13", "ovs_interfaceid": "028ff50a-13c6-4a41-99f1-93a1d2fa2fa4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1111.242976] env[62277]: DEBUG oslo_concurrency.lockutils [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Releasing lock "refresh_cache-346748bd-b4e8-4e93-b71d-66c90a45e372" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1111.243355] env[62277]: DEBUG nova.compute.manager [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 
tempest-TenantUsagesTestJSON-526754953-project-member] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Instance network_info: |[{"id": "028ff50a-13c6-4a41-99f1-93a1d2fa2fa4", "address": "fa:16:3e:87:7a:1b", "network": {"id": "2c64e00f-4995-4827-b685-5096b1d1d064", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.188", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "63ed6630e9c140baa826f53d7a0564d1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "77aa121f-8fb6-42f3-aaea-43addfe449b2", "external-id": "nsx-vlan-transportzone-288", "segmentation_id": 288, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap028ff50a-13", "ovs_interfaceid": "028ff50a-13c6-4a41-99f1-93a1d2fa2fa4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1111.243752] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:87:7a:1b', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '77aa121f-8fb6-42f3-aaea-43addfe449b2', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '028ff50a-13c6-4a41-99f1-93a1d2fa2fa4', 'vif_model': 'vmxnet3'}] {{(pid=62277) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1111.251511] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Creating folder: Project (26e9cf6618c04f39a1644982221981ee). Parent ref: group-v297781. {{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1111.252116] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-aec1c593-5abc-47f5-b448-be6b97334caf {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1111.264187] env[62277]: INFO nova.virt.vmwareapi.vm_util [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Created folder: Project (26e9cf6618c04f39a1644982221981ee) in parent group-v297781. [ 1111.264372] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Creating folder: Instances. Parent ref: group-v297823. 
{{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1111.264639] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e1ad121a-f5f6-4603-8313-a35662110352 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1111.273744] env[62277]: INFO nova.virt.vmwareapi.vm_util [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Created folder: Instances in parent group-v297823. [ 1111.273970] env[62277]: DEBUG oslo.service.loopingcall [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1111.274165] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Creating VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1111.274359] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-0069218b-ce5a-41a4-b132-7234beddb0b5 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1111.294935] env[62277]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1111.294935] env[62277]: value = "task-1405345" [ 1111.294935] env[62277]: _type = "Task" [ 1111.294935] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1111.302450] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405345, 'name': CreateVM_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1111.804912] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405345, 'name': CreateVM_Task, 'duration_secs': 0.317429} completed successfully. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1111.805100] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Created VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1111.805747] env[62277]: DEBUG oslo_concurrency.lockutils [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1111.805904] env[62277]: DEBUG oslo_concurrency.lockutils [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1111.806232] env[62277]: DEBUG oslo_concurrency.lockutils [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1111.806482] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-14d019cf-af15-4b47-9977-9c3869847d30 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1111.810737] env[62277]: DEBUG oslo_vmware.api [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Waiting for the task: (returnval){ [ 1111.810737] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]5268911b-ac95-92f2-f905-7e144e7bdd47" [ 1111.810737] env[62277]: _type = "Task" [ 1111.810737] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1111.819640] env[62277]: DEBUG oslo_vmware.api [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]5268911b-ac95-92f2-f905-7e144e7bdd47, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1112.322415] env[62277]: DEBUG oslo_concurrency.lockutils [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1112.322956] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Processing image 6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1112.322956] env[62277]: DEBUG oslo_concurrency.lockutils [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1112.926666] env[62277]: DEBUG nova.compute.manager [req-7dcb9212-5ec3-4745-9c1e-29659f13edc9 req-1ddb1cb8-e092-48f1-8025-8e6840b093ec service nova] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Received event network-changed-028ff50a-13c6-4a41-99f1-93a1d2fa2fa4 {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 1112.926879] env[62277]: DEBUG nova.compute.manager [req-7dcb9212-5ec3-4745-9c1e-29659f13edc9 req-1ddb1cb8-e092-48f1-8025-8e6840b093ec service nova] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Refreshing instance network info cache due to event network-changed-028ff50a-13c6-4a41-99f1-93a1d2fa2fa4. {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11104}} [ 1112.927079] env[62277]: DEBUG oslo_concurrency.lockutils [req-7dcb9212-5ec3-4745-9c1e-29659f13edc9 req-1ddb1cb8-e092-48f1-8025-8e6840b093ec service nova] Acquiring lock "refresh_cache-346748bd-b4e8-4e93-b71d-66c90a45e372" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1112.927204] env[62277]: DEBUG oslo_concurrency.lockutils [req-7dcb9212-5ec3-4745-9c1e-29659f13edc9 req-1ddb1cb8-e092-48f1-8025-8e6840b093ec service nova] Acquired lock "refresh_cache-346748bd-b4e8-4e93-b71d-66c90a45e372" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1112.927363] env[62277]: DEBUG nova.network.neutron [req-7dcb9212-5ec3-4745-9c1e-29659f13edc9 req-1ddb1cb8-e092-48f1-8025-8e6840b093ec service nova] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Refreshing network info cache for port 028ff50a-13c6-4a41-99f1-93a1d2fa2fa4 {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1113.234142] env[62277]: DEBUG nova.network.neutron [req-7dcb9212-5ec3-4745-9c1e-29659f13edc9 req-1ddb1cb8-e092-48f1-8025-8e6840b093ec service nova] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Updated VIF entry in instance network info cache for port 028ff50a-13c6-4a41-99f1-93a1d2fa2fa4. 
{{(pid=62277) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1113.234636] env[62277]: DEBUG nova.network.neutron [req-7dcb9212-5ec3-4745-9c1e-29659f13edc9 req-1ddb1cb8-e092-48f1-8025-8e6840b093ec service nova] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Updating instance_info_cache with network_info: [{"id": "028ff50a-13c6-4a41-99f1-93a1d2fa2fa4", "address": "fa:16:3e:87:7a:1b", "network": {"id": "2c64e00f-4995-4827-b685-5096b1d1d064", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.188", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "63ed6630e9c140baa826f53d7a0564d1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "77aa121f-8fb6-42f3-aaea-43addfe449b2", "external-id": "nsx-vlan-transportzone-288", "segmentation_id": 288, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap028ff50a-13", "ovs_interfaceid": "028ff50a-13c6-4a41-99f1-93a1d2fa2fa4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1113.257780] env[62277]: DEBUG oslo_concurrency.lockutils [req-7dcb9212-5ec3-4745-9c1e-29659f13edc9 req-1ddb1cb8-e092-48f1-8025-8e6840b093ec service nova] Releasing lock "refresh_cache-346748bd-b4e8-4e93-b71d-66c90a45e372" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1129.132627] env[62277]: WARNING oslo_vmware.rw_handles [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1129.132627] env[62277]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1129.132627] env[62277]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1129.132627] env[62277]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1129.132627] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1129.132627] env[62277]: ERROR oslo_vmware.rw_handles response.begin() [ 1129.132627] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1129.132627] env[62277]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1129.132627] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1129.132627] env[62277]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1129.132627] env[62277]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1129.132627] env[62277]: ERROR oslo_vmware.rw_handles [ 1129.133214] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c 
tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] Downloaded image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to vmware_temp/5afe04d8-0f27-4994-b468-43620ffd1bb0/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1129.134703] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] Caching image {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1129.134944] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Copying Virtual Disk [datastore2] vmware_temp/5afe04d8-0f27-4994-b468-43620ffd1bb0/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk to [datastore2] vmware_temp/5afe04d8-0f27-4994-b468-43620ffd1bb0/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk {{(pid=62277) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1129.135240] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-4a813161-32d2-4f06-ad3e-57b90b2d93ad {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1129.144244] env[62277]: DEBUG oslo_vmware.api [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Waiting for the task: (returnval){ [ 1129.144244] env[62277]: value = "task-1405346" [ 1129.144244] env[62277]: _type = "Task" [ 1129.144244] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1129.152987] env[62277]: DEBUG oslo_vmware.api [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Task: {'id': task-1405346, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1129.654826] env[62277]: DEBUG oslo_vmware.exceptions [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Fault InvalidArgument not matched. 
{{(pid=62277) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1129.655127] env[62277]: DEBUG oslo_concurrency.lockutils [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1129.655667] env[62277]: ERROR nova.compute.manager [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1129.655667] env[62277]: Faults: ['InvalidArgument'] [ 1129.655667] env[62277]: ERROR nova.compute.manager [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] Traceback (most recent call last): [ 1129.655667] env[62277]: ERROR nova.compute.manager [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1129.655667] env[62277]: ERROR nova.compute.manager [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] yield resources [ 1129.655667] env[62277]: ERROR nova.compute.manager [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1129.655667] env[62277]: ERROR nova.compute.manager [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] self.driver.spawn(context, instance, image_meta, [ 1129.655667] env[62277]: ERROR nova.compute.manager [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1129.655667] env[62277]: ERROR nova.compute.manager [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1129.655667] env[62277]: ERROR nova.compute.manager [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1129.655667] env[62277]: ERROR nova.compute.manager [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] self._fetch_image_if_missing(context, vi) [ 1129.655667] env[62277]: ERROR nova.compute.manager [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1129.656023] env[62277]: ERROR nova.compute.manager [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] image_cache(vi, tmp_image_ds_loc) [ 1129.656023] env[62277]: ERROR nova.compute.manager [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1129.656023] env[62277]: ERROR nova.compute.manager [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] vm_util.copy_virtual_disk( [ 1129.656023] env[62277]: ERROR nova.compute.manager [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1129.656023] env[62277]: ERROR nova.compute.manager [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] session._wait_for_task(vmdk_copy_task) [ 1129.656023] env[62277]: ERROR nova.compute.manager [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1129.656023] env[62277]: ERROR nova.compute.manager [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] return self.wait_for_task(task_ref) [ 1129.656023] env[62277]: ERROR nova.compute.manager [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1129.656023] env[62277]: ERROR nova.compute.manager [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] return evt.wait() [ 1129.656023] env[62277]: ERROR nova.compute.manager [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1129.656023] env[62277]: ERROR nova.compute.manager [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] result = hub.switch() [ 1129.656023] env[62277]: ERROR nova.compute.manager [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1129.656023] env[62277]: ERROR nova.compute.manager [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] return self.greenlet.switch() [ 1129.656372] env[62277]: ERROR nova.compute.manager [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1129.656372] env[62277]: ERROR nova.compute.manager [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] self.f(*self.args, **self.kw) [ 1129.656372] env[62277]: ERROR nova.compute.manager [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1129.656372] env[62277]: ERROR nova.compute.manager [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] raise exceptions.translate_fault(task_info.error) [ 1129.656372] env[62277]: ERROR nova.compute.manager [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1129.656372] env[62277]: ERROR nova.compute.manager [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] Faults: ['InvalidArgument'] [ 1129.656372] env[62277]: ERROR nova.compute.manager [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] [ 1129.656372] env[62277]: INFO nova.compute.manager [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] Terminating instance [ 1129.657482] env[62277]: DEBUG oslo_concurrency.lockutils [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1129.657691] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1129.657928] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5b2bc407-e864-4e69-83e3-6e676ab0e131 
{{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1129.660296] env[62277]: DEBUG nova.compute.manager [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] Start destroying the instance on the hypervisor. {{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1129.660511] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1129.661248] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6d16eae8-b0a6-4938-8b72-ac73db2454d7 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1129.667971] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] Unregistering the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1129.668197] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-6612aa20-d691-43f6-815f-dd6e7aee1988 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1129.670377] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1129.670553] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62277) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1129.671488] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-bc0fa632-eec0-438d-81ff-afb3f57fbb24 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1129.676282] env[62277]: DEBUG oslo_vmware.api [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Waiting for the task: (returnval){ [ 1129.676282] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52e98115-cee1-173b-f642-8ae8a32a18b4" [ 1129.676282] env[62277]: _type = "Task" [ 1129.676282] env[62277]: } to complete. 
{{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1129.683046] env[62277]: DEBUG oslo_vmware.api [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52e98115-cee1-173b-f642-8ae8a32a18b4, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1129.735375] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] Unregistered the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1129.735619] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] Deleting contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1129.735827] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Deleting the datastore file [datastore2] 3d260cd8-ab21-4e1e-8891-6f216350a587 {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1129.736345] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-dd4019fb-c5f1-4a91-8361-0e1b9b19baa7 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1129.743354] env[62277]: DEBUG oslo_vmware.api [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Waiting for the task: (returnval){ [ 1129.743354] env[62277]: value = "task-1405348" [ 1129.743354] env[62277]: _type = "Task" [ 1129.743354] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1129.751158] env[62277]: DEBUG oslo_vmware.api [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Task: {'id': task-1405348, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1130.187200] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] Preparing fetch location {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1130.187477] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Creating directory with path [datastore2] vmware_temp/01a403d3-d766-4b1a-b3e8-db65ab3554ac/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1130.187715] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e1b60188-33e0-4d1a-8943-21c21d0055eb {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1130.199454] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Created directory with path [datastore2] vmware_temp/01a403d3-d766-4b1a-b3e8-db65ab3554ac/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1130.199660] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] Fetch image to [datastore2] vmware_temp/01a403d3-d766-4b1a-b3e8-db65ab3554ac/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1130.199830] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to [datastore2] vmware_temp/01a403d3-d766-4b1a-b3e8-db65ab3554ac/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1130.200595] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-37f23056-86ba-4b9e-b678-a2259f2a66ed {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1130.207446] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-76fe08ae-1810-4a82-ab1e-53c75bd5d7bf {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1130.216443] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1277f639-2ede-4de2-a565-69a5554abd24 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1130.251011] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-79b96076-4624-47f4-aea8-36295cb0e646 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1130.258307] env[62277]: DEBUG oslo_vmware.api [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Task: {'id': task-1405348, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.081421} completed successfully. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1130.259771] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Deleted the datastore file {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1130.259961] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] Deleted contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1130.260147] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1130.260320] env[62277]: INFO nova.compute.manager [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1130.262597] env[62277]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-fb3835c4-a405-41af-ae0c-e1882eba4966 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1130.264070] env[62277]: DEBUG nova.compute.claims [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] Aborting claim: {{(pid=62277) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1130.264199] env[62277]: DEBUG oslo_concurrency.lockutils [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1130.264417] env[62277]: DEBUG oslo_concurrency.lockutils [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1130.285253] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1130.346376] env[62277]: DEBUG oslo_vmware.rw_handles [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/01a403d3-d766-4b1a-b3e8-db65ab3554ac/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62277) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1130.408824] env[62277]: DEBUG oslo_vmware.rw_handles [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Completed reading data from the image iterator. {{(pid=62277) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1130.409018] env[62277]: DEBUG oslo_vmware.rw_handles [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/01a403d3-d766-4b1a-b3e8-db65ab3554ac/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=62277) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1130.707470] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-36629b05-3dc5-4a0b-a974-6ddfe0eebdea {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1130.715482] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd1fe8df-d49c-4340-b6c0-80e95b4e0699 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1130.746260] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5e35d90a-1832-411c-9125-624cfcd241e7 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1130.753674] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-862eb5ac-b85a-4f19-b01c-e0542eab6042 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1130.766599] env[62277]: DEBUG nova.compute.provider_tree [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1130.775028] env[62277]: DEBUG nova.scheduler.client.report [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1130.788761] env[62277]: DEBUG oslo_concurrency.lockutils [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.524s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1130.789305] env[62277]: ERROR nova.compute.manager [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1130.789305] env[62277]: Faults: ['InvalidArgument'] [ 1130.789305] env[62277]: ERROR nova.compute.manager [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] Traceback (most recent call last): [ 1130.789305] env[62277]: ERROR nova.compute.manager [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 
1130.789305] env[62277]: ERROR nova.compute.manager [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] self.driver.spawn(context, instance, image_meta, [ 1130.789305] env[62277]: ERROR nova.compute.manager [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1130.789305] env[62277]: ERROR nova.compute.manager [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1130.789305] env[62277]: ERROR nova.compute.manager [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1130.789305] env[62277]: ERROR nova.compute.manager [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] self._fetch_image_if_missing(context, vi) [ 1130.789305] env[62277]: ERROR nova.compute.manager [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1130.789305] env[62277]: ERROR nova.compute.manager [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] image_cache(vi, tmp_image_ds_loc) [ 1130.789305] env[62277]: ERROR nova.compute.manager [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1130.789680] env[62277]: ERROR nova.compute.manager [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] vm_util.copy_virtual_disk( [ 1130.789680] env[62277]: ERROR nova.compute.manager [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1130.789680] env[62277]: ERROR nova.compute.manager [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] session._wait_for_task(vmdk_copy_task) [ 1130.789680] env[62277]: ERROR nova.compute.manager [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1130.789680] env[62277]: ERROR nova.compute.manager [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] return self.wait_for_task(task_ref) [ 1130.789680] env[62277]: ERROR nova.compute.manager [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1130.789680] env[62277]: ERROR nova.compute.manager [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] return evt.wait() [ 1130.789680] env[62277]: ERROR nova.compute.manager [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1130.789680] env[62277]: ERROR nova.compute.manager [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] result = hub.switch() [ 1130.789680] env[62277]: ERROR nova.compute.manager [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1130.789680] env[62277]: ERROR nova.compute.manager [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] return self.greenlet.switch() [ 1130.789680] env[62277]: ERROR nova.compute.manager [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1130.789680] env[62277]: ERROR nova.compute.manager [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] self.f(*self.args, **self.kw) [ 1130.790657] env[62277]: ERROR nova.compute.manager [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] 
File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1130.790657] env[62277]: ERROR nova.compute.manager [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] raise exceptions.translate_fault(task_info.error) [ 1130.790657] env[62277]: ERROR nova.compute.manager [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1130.790657] env[62277]: ERROR nova.compute.manager [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] Faults: ['InvalidArgument'] [ 1130.790657] env[62277]: ERROR nova.compute.manager [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] [ 1130.790657] env[62277]: DEBUG nova.compute.utils [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] VimFaultException {{(pid=62277) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1130.791398] env[62277]: DEBUG nova.compute.manager [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] Build of instance 3d260cd8-ab21-4e1e-8891-6f216350a587 was re-scheduled: A specified parameter was not correct: fileType [ 1130.791398] env[62277]: Faults: ['InvalidArgument'] {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1130.791770] env[62277]: DEBUG nova.compute.manager [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] Unplugging VIFs for instance {{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1130.791936] env[62277]: DEBUG nova.compute.manager [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1130.792099] env[62277]: DEBUG nova.compute.manager [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1130.792259] env[62277]: DEBUG nova.network.neutron [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1131.081422] env[62277]: DEBUG nova.network.neutron [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1131.093880] env[62277]: INFO nova.compute.manager [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 3d260cd8-ab21-4e1e-8891-6f216350a587] Took 0.30 seconds to deallocate network for instance. [ 1131.197303] env[62277]: INFO nova.scheduler.client.report [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Deleted allocations for instance 3d260cd8-ab21-4e1e-8891-6f216350a587 [ 1131.218139] env[62277]: DEBUG oslo_concurrency.lockutils [None req-2d7823f9-0204-4919-84fb-6b95e6fb811c tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Lock "3d260cd8-ab21-4e1e-8891-6f216350a587" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 147.829s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1131.232825] env[62277]: DEBUG nova.compute.manager [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Starting instance... 
{{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1131.292391] env[62277]: DEBUG oslo_concurrency.lockutils [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1131.292699] env[62277]: DEBUG oslo_concurrency.lockutils [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1131.294420] env[62277]: INFO nova.compute.claims [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1131.656243] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b712375-ae13-448c-915f-456d48d42b86 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1131.664257] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ae64018f-529e-4575-8e3a-b8e830590b62 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1131.694875] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c9e6526-da29-470e-bb0e-85807c48b5d0 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1131.702382] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3cf0b7b8-ae6e-4c84-8760-73b7ec865721 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1131.715537] env[62277]: DEBUG nova.compute.provider_tree [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1131.724374] env[62277]: DEBUG nova.scheduler.client.report [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider 
/opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1131.738289] env[62277]: DEBUG oslo_concurrency.lockutils [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.446s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1131.738825] env[62277]: DEBUG nova.compute.manager [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Start building networks asynchronously for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1131.775614] env[62277]: DEBUG nova.compute.utils [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Using /dev/sd instead of None {{(pid=62277) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1131.780455] env[62277]: DEBUG nova.compute.manager [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Allocating IP information in the background. {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1131.780593] env[62277]: DEBUG nova.network.neutron [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] allocate_for_instance() {{(pid=62277) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1131.789758] env[62277]: DEBUG nova.compute.manager [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Start building block device mappings for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1131.842516] env[62277]: DEBUG nova.policy [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '567913dd8fd3447a86bdb0132cc8b862', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fdf9c637644943bfa8f2c698cdfaa268', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62277) authorize /opt/stack/nova/nova/policy.py:203}} [ 1131.862708] env[62277]: DEBUG nova.compute.manager [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Start spawning the instance on the hypervisor. 
{{(pid=62277) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1131.894511] env[62277]: DEBUG nova.virt.hardware [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-11-06T22:19:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='161441275',id=25,is_public=True,memory_mb=128,name='tempest-flavor_with_ephemeral_0-946435932',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-11-06T22:11:15Z,direct_url=,disk_format='vmdk',id=6f125163-af69-40e9-92ae-3b8a01d74b60,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='63ed6630e9c140baa826f53d7a0564d1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-11-06T22:11:16Z,virtual_size=,visibility=), allow threads: False {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1131.894778] env[62277]: DEBUG nova.virt.hardware [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Flavor limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1131.894983] env[62277]: DEBUG nova.virt.hardware [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Image limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1131.895140] env[62277]: DEBUG nova.virt.hardware [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Flavor pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1131.895285] env[62277]: DEBUG nova.virt.hardware [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Image pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1131.895427] env[62277]: DEBUG nova.virt.hardware [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1131.895633] env[62277]: DEBUG nova.virt.hardware [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1131.895789] env[62277]: DEBUG nova.virt.hardware [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 
tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1131.895951] env[62277]: DEBUG nova.virt.hardware [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Got 1 possible topologies {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1131.896132] env[62277]: DEBUG nova.virt.hardware [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1131.896307] env[62277]: DEBUG nova.virt.hardware [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1131.897154] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e2a7bfc0-bd97-44dd-aa5c-5cf4db0ba815 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1131.905878] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6abae4d0-7b01-42d9-a3f8-4842d2476b4a {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1132.445526] env[62277]: DEBUG nova.network.neutron [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Successfully created port: 53176b93-dde5-451c-a76f-a13e68157ef7 {{(pid=62277) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1132.694098] env[62277]: DEBUG oslo_concurrency.lockutils [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Acquiring lock "21bd4623-2b46-43c4-859f-c4d3bf261e1f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1132.694342] env[62277]: DEBUG oslo_concurrency.lockutils [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Lock "21bd4623-2b46-43c4-859f-c4d3bf261e1f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1133.301049] env[62277]: DEBUG nova.network.neutron [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Successfully updated port: 53176b93-dde5-451c-a76f-a13e68157ef7 
{{(pid=62277) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1133.313966] env[62277]: DEBUG oslo_concurrency.lockutils [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Acquiring lock "refresh_cache-930ff058-ab48-4c8a-8f5e-4820a3b12d50" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1133.314140] env[62277]: DEBUG oslo_concurrency.lockutils [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Acquired lock "refresh_cache-930ff058-ab48-4c8a-8f5e-4820a3b12d50" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1133.314289] env[62277]: DEBUG nova.network.neutron [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Building network info cache for instance {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1133.357570] env[62277]: DEBUG nova.compute.manager [req-b6d24012-8a3f-4aee-aa45-f89fa0888f02 req-94c8b523-49c8-4231-b17b-4915573fd474 service nova] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Received event network-vif-plugged-53176b93-dde5-451c-a76f-a13e68157ef7 {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 1133.357681] env[62277]: DEBUG oslo_concurrency.lockutils [req-b6d24012-8a3f-4aee-aa45-f89fa0888f02 req-94c8b523-49c8-4231-b17b-4915573fd474 service nova] Acquiring lock "930ff058-ab48-4c8a-8f5e-4820a3b12d50-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1133.359337] env[62277]: DEBUG oslo_concurrency.lockutils [req-b6d24012-8a3f-4aee-aa45-f89fa0888f02 req-94c8b523-49c8-4231-b17b-4915573fd474 service nova] Lock "930ff058-ab48-4c8a-8f5e-4820a3b12d50-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1133.359565] env[62277]: DEBUG oslo_concurrency.lockutils [req-b6d24012-8a3f-4aee-aa45-f89fa0888f02 req-94c8b523-49c8-4231-b17b-4915573fd474 service nova] Lock "930ff058-ab48-4c8a-8f5e-4820a3b12d50-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.002s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1133.359742] env[62277]: DEBUG nova.compute.manager [req-b6d24012-8a3f-4aee-aa45-f89fa0888f02 req-94c8b523-49c8-4231-b17b-4915573fd474 service nova] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] No waiting events found dispatching network-vif-plugged-53176b93-dde5-451c-a76f-a13e68157ef7 {{(pid=62277) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1133.359906] env[62277]: WARNING nova.compute.manager [req-b6d24012-8a3f-4aee-aa45-f89fa0888f02 req-94c8b523-49c8-4231-b17b-4915573fd474 service nova] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Received unexpected event network-vif-plugged-53176b93-dde5-451c-a76f-a13e68157ef7 for instance with vm_state building and 
task_state spawning. [ 1133.374569] env[62277]: DEBUG nova.network.neutron [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Instance cache missing network info. {{(pid=62277) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1133.583703] env[62277]: DEBUG nova.network.neutron [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Updating instance_info_cache with network_info: [{"id": "53176b93-dde5-451c-a76f-a13e68157ef7", "address": "fa:16:3e:34:06:74", "network": {"id": "2d9b4a6b-c2ff-44b3-b1cf-d86c844eba30", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-284483849-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "fdf9c637644943bfa8f2c698cdfaa268", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a4307c18-b235-43cd-bcd5-e226012d8ee9", "external-id": "nsx-vlan-transportzone-867", "segmentation_id": 867, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap53176b93-dd", "ovs_interfaceid": "53176b93-dde5-451c-a76f-a13e68157ef7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1133.595549] env[62277]: DEBUG oslo_concurrency.lockutils [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Releasing lock "refresh_cache-930ff058-ab48-4c8a-8f5e-4820a3b12d50" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1133.595850] env[62277]: DEBUG nova.compute.manager [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Instance network_info: |[{"id": "53176b93-dde5-451c-a76f-a13e68157ef7", "address": "fa:16:3e:34:06:74", "network": {"id": "2d9b4a6b-c2ff-44b3-b1cf-d86c844eba30", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-284483849-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "fdf9c637644943bfa8f2c698cdfaa268", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": 
"a4307c18-b235-43cd-bcd5-e226012d8ee9", "external-id": "nsx-vlan-transportzone-867", "segmentation_id": 867, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap53176b93-dd", "ovs_interfaceid": "53176b93-dde5-451c-a76f-a13e68157ef7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1133.596249] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:34:06:74', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'a4307c18-b235-43cd-bcd5-e226012d8ee9', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '53176b93-dde5-451c-a76f-a13e68157ef7', 'vif_model': 'vmxnet3'}] {{(pid=62277) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1133.603681] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Creating folder: Project (fdf9c637644943bfa8f2c698cdfaa268). Parent ref: group-v297781. {{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1133.604597] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-031e0cd5-c3e0-4b75-b4b6-5e2ef3e88792 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1133.615223] env[62277]: INFO nova.virt.vmwareapi.vm_util [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Created folder: Project (fdf9c637644943bfa8f2c698cdfaa268) in parent group-v297781. [ 1133.615403] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Creating folder: Instances. Parent ref: group-v297826. {{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1133.615627] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b993d496-37d1-4b45-a92d-e79a1dc4b831 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1133.625259] env[62277]: INFO nova.virt.vmwareapi.vm_util [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Created folder: Instances in parent group-v297826. [ 1133.625473] env[62277]: DEBUG oslo.service.loopingcall [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1133.625690] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Creating VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1133.625898] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-626476f7-26f5-47c7-8eff-ed8112a18327 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1133.643996] env[62277]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1133.643996] env[62277]: value = "task-1405351" [ 1133.643996] env[62277]: _type = "Task" [ 1133.643996] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1133.651207] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405351, 'name': CreateVM_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1134.154484] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405351, 'name': CreateVM_Task, 'duration_secs': 0.290619} completed successfully. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1134.154484] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Created VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1134.157022] env[62277]: DEBUG oslo_concurrency.lockutils [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1134.157022] env[62277]: DEBUG oslo_concurrency.lockutils [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1134.157022] env[62277]: DEBUG oslo_concurrency.lockutils [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1134.157022] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9538b773-5b14-4e4a-bf18-fefe30def217 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1134.160330] env[62277]: DEBUG oslo_vmware.api [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Waiting for the task: (returnval){ [ 1134.160330] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52c40466-9a85-13ec-472c-c4f6a8749a4c" [ 1134.160330] 
env[62277]: _type = "Task" [ 1134.160330] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1134.168391] env[62277]: DEBUG oslo_vmware.api [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52c40466-9a85-13ec-472c-c4f6a8749a4c, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1134.670854] env[62277]: DEBUG oslo_concurrency.lockutils [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1134.670854] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Processing image 6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1134.671176] env[62277]: DEBUG oslo_concurrency.lockutils [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1135.420542] env[62277]: DEBUG nova.compute.manager [req-f9a9f402-cbac-48f8-bc71-94ea5512adde req-8d253718-cac7-471b-a21f-882683511476 service nova] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Received event network-changed-53176b93-dde5-451c-a76f-a13e68157ef7 {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 1135.420740] env[62277]: DEBUG nova.compute.manager [req-f9a9f402-cbac-48f8-bc71-94ea5512adde req-8d253718-cac7-471b-a21f-882683511476 service nova] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Refreshing instance network info cache due to event network-changed-53176b93-dde5-451c-a76f-a13e68157ef7. 
{{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11104}} [ 1135.420951] env[62277]: DEBUG oslo_concurrency.lockutils [req-f9a9f402-cbac-48f8-bc71-94ea5512adde req-8d253718-cac7-471b-a21f-882683511476 service nova] Acquiring lock "refresh_cache-930ff058-ab48-4c8a-8f5e-4820a3b12d50" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1135.421140] env[62277]: DEBUG oslo_concurrency.lockutils [req-f9a9f402-cbac-48f8-bc71-94ea5512adde req-8d253718-cac7-471b-a21f-882683511476 service nova] Acquired lock "refresh_cache-930ff058-ab48-4c8a-8f5e-4820a3b12d50" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1135.421258] env[62277]: DEBUG nova.network.neutron [req-f9a9f402-cbac-48f8-bc71-94ea5512adde req-8d253718-cac7-471b-a21f-882683511476 service nova] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Refreshing network info cache for port 53176b93-dde5-451c-a76f-a13e68157ef7 {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1135.772202] env[62277]: DEBUG nova.network.neutron [req-f9a9f402-cbac-48f8-bc71-94ea5512adde req-8d253718-cac7-471b-a21f-882683511476 service nova] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Updated VIF entry in instance network info cache for port 53176b93-dde5-451c-a76f-a13e68157ef7. {{(pid=62277) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1135.772621] env[62277]: DEBUG nova.network.neutron [req-f9a9f402-cbac-48f8-bc71-94ea5512adde req-8d253718-cac7-471b-a21f-882683511476 service nova] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Updating instance_info_cache with network_info: [{"id": "53176b93-dde5-451c-a76f-a13e68157ef7", "address": "fa:16:3e:34:06:74", "network": {"id": "2d9b4a6b-c2ff-44b3-b1cf-d86c844eba30", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-284483849-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "fdf9c637644943bfa8f2c698cdfaa268", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a4307c18-b235-43cd-bcd5-e226012d8ee9", "external-id": "nsx-vlan-transportzone-867", "segmentation_id": 867, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap53176b93-dd", "ovs_interfaceid": "53176b93-dde5-451c-a76f-a13e68157ef7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1135.781929] env[62277]: DEBUG oslo_concurrency.lockutils [req-f9a9f402-cbac-48f8-bc71-94ea5512adde req-8d253718-cac7-471b-a21f-882683511476 service nova] Releasing lock "refresh_cache-930ff058-ab48-4c8a-8f5e-4820a3b12d50" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1155.170443] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62277) 
run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1155.170704] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Starting heal instance info cache {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9909}} [ 1155.170798] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Rebuilding the list of instances to heal {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9913}} [ 1155.191528] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1155.191688] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1155.191822] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1155.191949] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1155.192090] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 68925f1b-da69-4955-acb1-d6500b03daee] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1155.192221] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1155.192342] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1155.192457] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1155.192571] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Skipping network cache update for instance because it is Building. 
{{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1155.192722] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1155.192846] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Didn't find any instances for network info cache update. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9995}} [ 1155.193356] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1156.168311] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1157.168561] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1158.163970] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1159.169025] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1159.169025] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1159.169025] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager.update_available_resource {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1159.180774] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1159.181055] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1159.181826] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock 
"compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1159.181826] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62277) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1159.182548] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cdd2a27a-93fc-45b1-94e8-1a3b1de24fcb {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1159.193046] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-57391193-4e2c-4062-83bb-cc1894e3ca65 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1159.206913] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f97bfcd2-dfd8-4dbd-b289-e0d95ffb8260 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1159.213532] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7a465632-1fad-43c7-9f97-fd5998ce0a5d {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1159.246073] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181456MB free_disk=184GB free_vcpus=48 pci_devices=None {{(pid=62277) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1159.246241] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1159.246432] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1159.318474] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance d68ccb50-a04d-4e59-8161-f01305eb81a8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1159.318634] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 19d15611-315f-4c4b-8f32-e5d00d0d8ca8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1159.318762] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance dfc291fd-1481-4e76-9fb3-ec87124c1281 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1159.318885] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 36ff1435-1999-4e95-8920-81a1b25cc452 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1159.319013] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 68925f1b-da69-4955-acb1-d6500b03daee actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1159.319137] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 866c4415-caab-4d81-86ba-ed662feb3c4f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1159.319252] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 350e2302-66b9-4dd6-b0f4-77000992408b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1159.319365] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 23bc5a48-3e96-4897-bf28-ad14a0bdde62 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1159.319477] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 346748bd-b4e8-4e93-b71d-66c90a45e372 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1159.319590] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 930ff058-ab48-4c8a-8f5e-4820a3b12d50 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1159.333313] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 154fa64c-55d4-4b72-8af9-39e72fd5df5f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1159.344517] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1159.371897] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 79177b7a-1bf6-4649-80c7-4ba2c6cda0ad has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1159.382789] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 4cc3443d-3d05-4d3b-a222-6b7367c1c989 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1159.399893] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 04122dba-4cbf-4176-95bd-f1c4bfaa799e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1159.409736] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 30d7d279-7241-4f2e-b963-d205a5f9fa41 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1159.419737] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance fb905507-549f-4265-9851-4a42930c02a4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1159.429454] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 4f5018a5-26f5-4a31-8a9d-d2557c906995 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1159.440351] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance c01ad807-a9a5-4028-baf4-0469a6301459 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1159.449635] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 1e8429b2-7149-4832-8590-e0ebd8501176 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1159.460686] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance f646a534-1ae8-40dd-9819-3d71bda87ae2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1159.473379] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 2b98866e-3c86-47bd-9eff-2c2743631563 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1159.482862] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 560b7750-03fe-4a4c-ab1d-a1751895986b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1159.493061] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 86e8e8ba-e476-400d-b180-bb7df8a042d8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1159.503282] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance c3e72352-f795-4ce7-9e0b-4e80c4329f7b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1159.516194] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance b6908c32-5916-4a0e-92e2-21f480c5f7ca has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1159.525968] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 27cd13ca-a17c-476e-a00a-cca1fe898763 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1159.535900] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance c000e183-2e57-470e-a9a5-30b5899e77c1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1159.545827] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 21bd4623-2b46-43c4-859f-c4d3bf261e1f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1159.546067] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1159.546222] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1159.890907] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db7c36de-5197-412c-9fd3-0f2806df2a99 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1159.898835] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1cffff5a-a6bd-4dc0-8a85-b2707f80cab6 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1159.928570] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8595824f-2e28-4140-9842-d6813b1f7f17 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1159.936291] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8696a146-e899-4aed-ba67-4b7ead497426 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1159.950135] env[62277]: DEBUG nova.compute.provider_tree [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1159.960876] env[62277]: DEBUG nova.scheduler.client.report [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1159.976264] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62277) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1159.976450] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.730s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1163.975905] env[62277]: DEBUG oslo_service.periodic_task [None 
req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1163.975905] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=62277) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10528}} [ 1178.317619] env[62277]: WARNING oslo_vmware.rw_handles [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1178.317619] env[62277]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1178.317619] env[62277]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1178.317619] env[62277]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1178.317619] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1178.317619] env[62277]: ERROR oslo_vmware.rw_handles response.begin() [ 1178.317619] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1178.317619] env[62277]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1178.317619] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1178.317619] env[62277]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1178.317619] env[62277]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1178.317619] env[62277]: ERROR oslo_vmware.rw_handles [ 1178.317619] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] Downloaded image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to vmware_temp/01a403d3-d766-4b1a-b3e8-db65ab3554ac/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1178.318785] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] Caching image {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1178.319199] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Copying Virtual Disk [datastore2] vmware_temp/01a403d3-d766-4b1a-b3e8-db65ab3554ac/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk to [datastore2] vmware_temp/01a403d3-d766-4b1a-b3e8-db65ab3554ac/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk {{(pid=62277) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1178.319518] env[62277]: DEBUG oslo_vmware.service [-] Invoking 
VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-ce6a7995-a994-48c7-a48e-55610fb30e54 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1178.328387] env[62277]: DEBUG oslo_vmware.api [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Waiting for the task: (returnval){ [ 1178.328387] env[62277]: value = "task-1405352" [ 1178.328387] env[62277]: _type = "Task" [ 1178.328387] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1178.336479] env[62277]: DEBUG oslo_vmware.api [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Task: {'id': task-1405352, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1178.838075] env[62277]: DEBUG oslo_vmware.exceptions [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Fault InvalidArgument not matched. {{(pid=62277) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1178.838386] env[62277]: DEBUG oslo_concurrency.lockutils [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1178.838923] env[62277]: ERROR nova.compute.manager [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1178.838923] env[62277]: Faults: ['InvalidArgument'] [ 1178.838923] env[62277]: ERROR nova.compute.manager [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] Traceback (most recent call last): [ 1178.838923] env[62277]: ERROR nova.compute.manager [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1178.838923] env[62277]: ERROR nova.compute.manager [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] yield resources [ 1178.838923] env[62277]: ERROR nova.compute.manager [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1178.838923] env[62277]: ERROR nova.compute.manager [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] self.driver.spawn(context, instance, image_meta, [ 1178.838923] env[62277]: ERROR nova.compute.manager [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1178.838923] env[62277]: ERROR nova.compute.manager [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1178.838923] env[62277]: ERROR nova.compute.manager [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, 
in spawn [ 1178.838923] env[62277]: ERROR nova.compute.manager [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] self._fetch_image_if_missing(context, vi) [ 1178.838923] env[62277]: ERROR nova.compute.manager [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1178.839414] env[62277]: ERROR nova.compute.manager [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] image_cache(vi, tmp_image_ds_loc) [ 1178.839414] env[62277]: ERROR nova.compute.manager [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1178.839414] env[62277]: ERROR nova.compute.manager [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] vm_util.copy_virtual_disk( [ 1178.839414] env[62277]: ERROR nova.compute.manager [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1178.839414] env[62277]: ERROR nova.compute.manager [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] session._wait_for_task(vmdk_copy_task) [ 1178.839414] env[62277]: ERROR nova.compute.manager [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1178.839414] env[62277]: ERROR nova.compute.manager [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] return self.wait_for_task(task_ref) [ 1178.839414] env[62277]: ERROR nova.compute.manager [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1178.839414] env[62277]: ERROR nova.compute.manager [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] return evt.wait() [ 1178.839414] env[62277]: ERROR nova.compute.manager [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1178.839414] env[62277]: ERROR nova.compute.manager [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] result = hub.switch() [ 1178.839414] env[62277]: ERROR nova.compute.manager [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1178.839414] env[62277]: ERROR nova.compute.manager [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] return self.greenlet.switch() [ 1178.839805] env[62277]: ERROR nova.compute.manager [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1178.839805] env[62277]: ERROR nova.compute.manager [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] self.f(*self.args, **self.kw) [ 1178.839805] env[62277]: ERROR nova.compute.manager [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1178.839805] env[62277]: ERROR nova.compute.manager [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] raise exceptions.translate_fault(task_info.error) [ 1178.839805] env[62277]: ERROR nova.compute.manager [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1178.839805] env[62277]: ERROR nova.compute.manager [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] Faults: ['InvalidArgument'] [ 1178.839805] env[62277]: ERROR nova.compute.manager [instance: 
d68ccb50-a04d-4e59-8161-f01305eb81a8] [ 1178.839805] env[62277]: INFO nova.compute.manager [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] Terminating instance [ 1178.840818] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1178.841012] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1178.841512] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c418f69b-5b5e-4880-b21d-cd5a9dc2befa {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1178.843847] env[62277]: DEBUG nova.compute.manager [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] Start destroying the instance on the hypervisor. {{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1178.844121] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1178.844882] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6b203513-a425-4eee-ae55-2d3e0b31c1d4 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1178.852771] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] Unregistering the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1178.852867] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-81475d31-a9e5-46d3-8ebc-b092a3cf0b4a {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1178.855275] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1178.856056] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 
tempest-ServersTestFqdnHostnames-1297902273-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62277) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1178.856402] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f80e72c5-6531-44db-8f32-fceb7a3831f1 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1178.861016] env[62277]: DEBUG oslo_vmware.api [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Waiting for the task: (returnval){ [ 1178.861016] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52ac26cd-5e8b-03da-94c2-5beea7f6eca3" [ 1178.861016] env[62277]: _type = "Task" [ 1178.861016] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1178.869119] env[62277]: DEBUG oslo_vmware.api [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52ac26cd-5e8b-03da-94c2-5beea7f6eca3, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1178.930097] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] Unregistered the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1178.930322] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] Deleting contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1178.930526] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Deleting the datastore file [datastore2] d68ccb50-a04d-4e59-8161-f01305eb81a8 {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1178.930929] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-9adf1edf-71fc-48bd-bcd4-108724963fd1 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1178.937690] env[62277]: DEBUG oslo_vmware.api [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Waiting for the task: (returnval){ [ 1178.937690] env[62277]: value = "task-1405354" [ 1178.937690] env[62277]: _type = "Task" [ 1178.937690] env[62277]: } to complete. 
{{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1178.946524] env[62277]: DEBUG oslo_vmware.api [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Task: {'id': task-1405354, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1179.371753] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] Preparing fetch location {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1179.372032] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Creating directory with path [datastore2] vmware_temp/6a1cbc53-a14e-495a-bca1-8725587c770c/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1179.372742] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f48d331e-1a45-4410-9def-bfa0682ec1ba {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1179.383924] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Created directory with path [datastore2] vmware_temp/6a1cbc53-a14e-495a-bca1-8725587c770c/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1179.383924] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] Fetch image to [datastore2] vmware_temp/6a1cbc53-a14e-495a-bca1-8725587c770c/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1179.383924] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to [datastore2] vmware_temp/6a1cbc53-a14e-495a-bca1-8725587c770c/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1179.384680] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7266f307-d89c-46e2-b592-cf923a945884 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1179.391961] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-001ef597-869c-4c70-9a6e-9d32c4448ff0 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1179.400194] env[62277]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82a2cdbd-6a52-4b0e-ba9b-bc620dd05616 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1179.430929] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-73934c04-d12f-4f11-9cb7-daa51750d1c9 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1179.436526] env[62277]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-ca6fb741-5a06-4305-bead-cb96be55b17e {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1179.445719] env[62277]: DEBUG oslo_vmware.api [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Task: {'id': task-1405354, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074326} completed successfully. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1179.445964] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Deleted the datastore file {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1179.446212] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] Deleted contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1179.446396] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1179.446569] env[62277]: INFO nova.compute.manager [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1179.448684] env[62277]: DEBUG nova.compute.claims [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] Aborting claim: {{(pid=62277) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1179.448846] env[62277]: DEBUG oslo_concurrency.lockutils [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1179.449111] env[62277]: DEBUG oslo_concurrency.lockutils [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1179.458817] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1179.520075] env[62277]: DEBUG oslo_vmware.rw_handles [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/6a1cbc53-a14e-495a-bca1-8725587c770c/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62277) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1179.584193] env[62277]: DEBUG oslo_vmware.rw_handles [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Completed reading data from the image iterator. {{(pid=62277) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1179.584471] env[62277]: DEBUG oslo_vmware.rw_handles [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/6a1cbc53-a14e-495a-bca1-8725587c770c/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=62277) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1179.869495] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-429a4458-0d1f-424c-9595-02b5e99a8e2b {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1179.877314] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-654dcd2a-dbb4-4bf4-9f5a-829d18191125 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1179.906539] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-552b5720-7fd2-43a2-89ee-e23843073155 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1179.913450] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b2823c0f-76c1-44b3-b968-799562997a71 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1179.926307] env[62277]: DEBUG nova.compute.provider_tree [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1179.934902] env[62277]: DEBUG nova.scheduler.client.report [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1179.950677] env[62277]: DEBUG oslo_concurrency.lockutils [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.501s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1179.951299] env[62277]: ERROR nova.compute.manager [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1179.951299] env[62277]: Faults: ['InvalidArgument'] [ 1179.951299] env[62277]: ERROR nova.compute.manager [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] Traceback (most recent call last): [ 1179.951299] env[62277]: ERROR nova.compute.manager [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1179.951299] env[62277]: 
ERROR nova.compute.manager [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] self.driver.spawn(context, instance, image_meta, [ 1179.951299] env[62277]: ERROR nova.compute.manager [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1179.951299] env[62277]: ERROR nova.compute.manager [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1179.951299] env[62277]: ERROR nova.compute.manager [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1179.951299] env[62277]: ERROR nova.compute.manager [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] self._fetch_image_if_missing(context, vi) [ 1179.951299] env[62277]: ERROR nova.compute.manager [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1179.951299] env[62277]: ERROR nova.compute.manager [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] image_cache(vi, tmp_image_ds_loc) [ 1179.951299] env[62277]: ERROR nova.compute.manager [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1179.951665] env[62277]: ERROR nova.compute.manager [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] vm_util.copy_virtual_disk( [ 1179.951665] env[62277]: ERROR nova.compute.manager [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1179.951665] env[62277]: ERROR nova.compute.manager [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] session._wait_for_task(vmdk_copy_task) [ 1179.951665] env[62277]: ERROR nova.compute.manager [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1179.951665] env[62277]: ERROR nova.compute.manager [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] return self.wait_for_task(task_ref) [ 1179.951665] env[62277]: ERROR nova.compute.manager [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1179.951665] env[62277]: ERROR nova.compute.manager [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] return evt.wait() [ 1179.951665] env[62277]: ERROR nova.compute.manager [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1179.951665] env[62277]: ERROR nova.compute.manager [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] result = hub.switch() [ 1179.951665] env[62277]: ERROR nova.compute.manager [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1179.951665] env[62277]: ERROR nova.compute.manager [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] return self.greenlet.switch() [ 1179.951665] env[62277]: ERROR nova.compute.manager [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1179.951665] env[62277]: ERROR nova.compute.manager [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] self.f(*self.args, **self.kw) [ 1179.952017] env[62277]: ERROR nova.compute.manager [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1179.952017] env[62277]: ERROR nova.compute.manager [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] raise exceptions.translate_fault(task_info.error) [ 1179.952017] env[62277]: ERROR nova.compute.manager [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1179.952017] env[62277]: ERROR nova.compute.manager [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] Faults: ['InvalidArgument'] [ 1179.952017] env[62277]: ERROR nova.compute.manager [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] [ 1179.952017] env[62277]: DEBUG nova.compute.utils [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] VimFaultException {{(pid=62277) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1179.953610] env[62277]: DEBUG nova.compute.manager [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] Build of instance d68ccb50-a04d-4e59-8161-f01305eb81a8 was re-scheduled: A specified parameter was not correct: fileType [ 1179.953610] env[62277]: Faults: ['InvalidArgument'] {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1179.953987] env[62277]: DEBUG nova.compute.manager [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] Unplugging VIFs for instance {{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1179.954167] env[62277]: DEBUG nova.compute.manager [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1179.954339] env[62277]: DEBUG nova.compute.manager [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1179.954507] env[62277]: DEBUG nova.network.neutron [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1180.321627] env[62277]: DEBUG nova.network.neutron [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1180.334382] env[62277]: INFO nova.compute.manager [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] [instance: d68ccb50-a04d-4e59-8161-f01305eb81a8] Took 0.38 seconds to deallocate network for instance. [ 1180.448421] env[62277]: INFO nova.scheduler.client.report [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Deleted allocations for instance d68ccb50-a04d-4e59-8161-f01305eb81a8 [ 1180.477751] env[62277]: DEBUG oslo_concurrency.lockutils [None req-f1ba06da-bcdb-43db-825a-3eeab3051e35 tempest-ImagesOneServerTestJSON-637229228 tempest-ImagesOneServerTestJSON-637229228-project-member] Lock "d68ccb50-a04d-4e59-8161-f01305eb81a8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 193.353s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1180.490725] env[62277]: DEBUG nova.compute.manager [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] Starting instance... 
{{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1180.557870] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1180.558233] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1180.559749] env[62277]: INFO nova.compute.claims [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1180.917918] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e49de81f-488a-40f1-98e6-8d200e81e57e {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1180.925790] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d95cd6d7-31cd-458d-84d2-ba33ede8cf2f {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1180.955821] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ffb51bce-dbb1-41a8-a4e1-c78cc56adeb3 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1180.963412] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e335990d-a576-41d7-a833-3a03f7435073 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1180.977708] env[62277]: DEBUG nova.compute.provider_tree [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1180.985981] env[62277]: DEBUG nova.scheduler.client.report [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1181.000714] env[62277]: DEBUG 
oslo_concurrency.lockutils [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.442s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1181.001099] env[62277]: DEBUG nova.compute.manager [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] Start building networks asynchronously for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1181.036818] env[62277]: DEBUG nova.compute.utils [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Using /dev/sd instead of None {{(pid=62277) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1181.037945] env[62277]: DEBUG nova.compute.manager [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] Allocating IP information in the background. {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1181.038121] env[62277]: DEBUG nova.network.neutron [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] allocate_for_instance() {{(pid=62277) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1181.047889] env[62277]: DEBUG nova.compute.manager [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] Start building block device mappings for instance. 
{{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1181.080195] env[62277]: INFO nova.virt.block_device [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] Booting with volume 8127ec42-b0f5-4086-b761-68b728ea559b at /dev/sda [ 1181.105381] env[62277]: DEBUG nova.policy [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4026924743874d7da44fc4b8da2c46c1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9b5a89a9f42e47f18fe9ad00e539bdce', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62277) authorize /opt/stack/nova/nova/policy.py:203}} [ 1181.127928] env[62277]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-9a47974c-5fd1-48a6-83a3-58ac997c9ded {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1181.136511] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f0c2338a-3408-4f25-99ca-5a128799418e {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1181.165943] env[62277]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-cfe22dd4-89ad-49d1-b459-e09feb2aa11a {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1181.173545] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-55376550-647e-4de3-8146-d9caa3fbf652 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1181.202290] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0aae57f8-c52f-434f-b5be-6a753dea7197 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1181.208796] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44f5fb93-85df-4a38-b814-6419bad32ce4 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1181.222866] env[62277]: DEBUG nova.virt.block_device [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] Updating existing volume attachment record: 384405f9-47a4-4939-8317-55b73fa157b1 {{(pid=62277) _volume_attach /opt/stack/nova/nova/virt/block_device.py:631}} [ 1181.464768] env[62277]: DEBUG nova.compute.manager [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] Start spawning the instance on the hypervisor. 
{{(pid=62277) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1181.465328] env[62277]: DEBUG nova.virt.hardware [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-11-06T22:11:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=,container_format=,created_at=,direct_url=,disk_format=,id=,min_disk=0,min_ram=0,name=,owner=,properties=ImageMetaProps,protected=,size=1073741824,status='active',tags=,updated_at=,virtual_size=,visibility=), allow threads: False {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1181.465624] env[62277]: DEBUG nova.virt.hardware [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Flavor limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1181.465688] env[62277]: DEBUG nova.virt.hardware [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Image limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1181.465878] env[62277]: DEBUG nova.virt.hardware [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Flavor pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1181.466027] env[62277]: DEBUG nova.virt.hardware [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Image pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1181.466177] env[62277]: DEBUG nova.virt.hardware [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1181.466375] env[62277]: DEBUG nova.virt.hardware [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1181.466525] env[62277]: DEBUG nova.virt.hardware [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1181.466687] env[62277]: DEBUG nova.virt.hardware [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 
tempest-ServersTestBootFromVolume-1085470207-project-member] Got 1 possible topologies {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1181.466844] env[62277]: DEBUG nova.virt.hardware [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1181.467015] env[62277]: DEBUG nova.virt.hardware [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1181.468126] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5a905305-2427-422c-95dc-ee9a7cc40faf {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1181.477388] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-99ee74c0-825f-4da2-9e8d-ea6e42a6aaa6 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1181.741428] env[62277]: DEBUG nova.network.neutron [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] Successfully created port: 09dcbb3a-90a6-4559-be66-b51fd18ff545 {{(pid=62277) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1182.554659] env[62277]: DEBUG nova.compute.manager [req-6e4bcd3f-e664-4afe-a93b-38ae470cf003 req-531da914-45be-4c67-855b-01f97d6b1d72 service nova] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] Received event network-vif-plugged-09dcbb3a-90a6-4559-be66-b51fd18ff545 {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 1182.554659] env[62277]: DEBUG oslo_concurrency.lockutils [req-6e4bcd3f-e664-4afe-a93b-38ae470cf003 req-531da914-45be-4c67-855b-01f97d6b1d72 service nova] Acquiring lock "154fa64c-55d4-4b72-8af9-39e72fd5df5f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1182.554659] env[62277]: DEBUG oslo_concurrency.lockutils [req-6e4bcd3f-e664-4afe-a93b-38ae470cf003 req-531da914-45be-4c67-855b-01f97d6b1d72 service nova] Lock "154fa64c-55d4-4b72-8af9-39e72fd5df5f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1182.554659] env[62277]: DEBUG oslo_concurrency.lockutils [req-6e4bcd3f-e664-4afe-a93b-38ae470cf003 req-531da914-45be-4c67-855b-01f97d6b1d72 service nova] Lock "154fa64c-55d4-4b72-8af9-39e72fd5df5f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1182.555059] env[62277]: DEBUG nova.compute.manager [req-6e4bcd3f-e664-4afe-a93b-38ae470cf003 
req-531da914-45be-4c67-855b-01f97d6b1d72 service nova] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] No waiting events found dispatching network-vif-plugged-09dcbb3a-90a6-4559-be66-b51fd18ff545 {{(pid=62277) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1182.555059] env[62277]: WARNING nova.compute.manager [req-6e4bcd3f-e664-4afe-a93b-38ae470cf003 req-531da914-45be-4c67-855b-01f97d6b1d72 service nova] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] Received unexpected event network-vif-plugged-09dcbb3a-90a6-4559-be66-b51fd18ff545 for instance with vm_state building and task_state spawning. [ 1182.571665] env[62277]: DEBUG nova.network.neutron [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] Successfully updated port: 09dcbb3a-90a6-4559-be66-b51fd18ff545 {{(pid=62277) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1182.585079] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Acquiring lock "refresh_cache-154fa64c-55d4-4b72-8af9-39e72fd5df5f" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1182.586704] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Acquired lock "refresh_cache-154fa64c-55d4-4b72-8af9-39e72fd5df5f" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1182.586876] env[62277]: DEBUG nova.network.neutron [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] Building network info cache for instance {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1182.644514] env[62277]: DEBUG nova.network.neutron [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] Instance cache missing network info. 
{{(pid=62277) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1182.868832] env[62277]: DEBUG nova.network.neutron [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] Updating instance_info_cache with network_info: [{"id": "09dcbb3a-90a6-4559-be66-b51fd18ff545", "address": "fa:16:3e:4c:96:ea", "network": {"id": "2aecd07a-f859-416d-ab4b-42b5ba108266", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-154506376-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9b5a89a9f42e47f18fe9ad00e539bdce", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dba18786-598d-4e06-96db-b3dc1717530f", "external-id": "nsx-vlan-transportzone-741", "segmentation_id": 741, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap09dcbb3a-90", "ovs_interfaceid": "09dcbb3a-90a6-4559-be66-b51fd18ff545", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1182.889375] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Releasing lock "refresh_cache-154fa64c-55d4-4b72-8af9-39e72fd5df5f" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1182.889687] env[62277]: DEBUG nova.compute.manager [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] Instance network_info: |[{"id": "09dcbb3a-90a6-4559-be66-b51fd18ff545", "address": "fa:16:3e:4c:96:ea", "network": {"id": "2aecd07a-f859-416d-ab4b-42b5ba108266", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-154506376-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9b5a89a9f42e47f18fe9ad00e539bdce", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dba18786-598d-4e06-96db-b3dc1717530f", "external-id": "nsx-vlan-transportzone-741", "segmentation_id": 741, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap09dcbb3a-90", "ovs_interfaceid": "09dcbb3a-90a6-4559-be66-b51fd18ff545", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62277) 
_allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1182.890102] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:4c:96:ea', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'dba18786-598d-4e06-96db-b3dc1717530f', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '09dcbb3a-90a6-4559-be66-b51fd18ff545', 'vif_model': 'vmxnet3'}] {{(pid=62277) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1182.897726] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Creating folder: Project (9b5a89a9f42e47f18fe9ad00e539bdce). Parent ref: group-v297781. {{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1182.898332] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-33b81383-1c2e-43c0-a38b-b63586aa6389 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1182.913249] env[62277]: WARNING suds.client [-] Web service reported a SOAP processing fault using an unexpected HTTP status code 200. Reporting as an internal server error. [ 1182.913614] env[62277]: DEBUG oslo_vmware.api [-] Fault list: [DuplicateName] {{(pid=62277) _invoke_api /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:337}} [ 1182.914041] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Folder already exists: Project (9b5a89a9f42e47f18fe9ad00e539bdce). Parent ref: group-v297781. {{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1599}} [ 1182.914251] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Creating folder: Instances. Parent ref: group-v297800. {{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1182.914489] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e3f9099e-9102-4085-a3a1-f179cc91a949 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1182.924064] env[62277]: INFO nova.virt.vmwareapi.vm_util [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Created folder: Instances in parent group-v297800. [ 1182.924303] env[62277]: DEBUG oslo.service.loopingcall [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1182.924491] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] Creating VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1182.924691] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-4118b9eb-72a5-49b9-b47b-dfff61604f46 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1182.948936] env[62277]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1182.948936] env[62277]: value = "task-1405357" [ 1182.948936] env[62277]: _type = "Task" [ 1182.948936] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1182.958139] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405357, 'name': CreateVM_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1183.460873] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405357, 'name': CreateVM_Task, 'duration_secs': 0.321612} completed successfully. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1183.460873] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] Created VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1183.461554] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] Block device information present: {'root_device_name': '/dev/sda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'boot_index': 0, 'device_type': None, 'disk_bus': None, 'guest_format': None, 'delete_on_termination': True, 'attachment_id': '384405f9-47a4-4939-8317-55b73fa157b1', 'connection_info': {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-297813', 'volume_id': '8127ec42-b0f5-4086-b761-68b728ea559b', 'name': 'volume-8127ec42-b0f5-4086-b761-68b728ea559b', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '154fa64c-55d4-4b72-8af9-39e72fd5df5f', 'attached_at': '', 'detached_at': '', 'volume_id': '8127ec42-b0f5-4086-b761-68b728ea559b', 'serial': '8127ec42-b0f5-4086-b761-68b728ea559b'}, 'mount_device': '/dev/sda', 'volume_type': None}], 'swap': None} {{(pid=62277) spawn /opt/stack/nova/nova/virt/vmwareapi/vmops.py:799}} [ 1183.461699] env[62277]: DEBUG nova.virt.vmwareapi.volumeops [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] Root volume attach. 
Driver type: vmdk {{(pid=62277) attach_root_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:661}} [ 1183.462430] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b065e6b3-6fbb-469b-a849-703051ac2102 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1183.471255] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01f338db-f536-4909-947e-cfa49c209a85 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1183.477731] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-34203ff5-e8d6-49a8-b34b-c6a742cb0faa {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1183.484145] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.RelocateVM_Task with opID=oslo.vmware-c93b3671-673a-49f5-b9c7-45cb107baf27 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1183.491094] env[62277]: DEBUG oslo_vmware.api [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Waiting for the task: (returnval){ [ 1183.491094] env[62277]: value = "task-1405358" [ 1183.491094] env[62277]: _type = "Task" [ 1183.491094] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1183.498718] env[62277]: DEBUG oslo_vmware.api [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Task: {'id': task-1405358, 'name': RelocateVM_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1184.003207] env[62277]: DEBUG oslo_vmware.api [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Task: {'id': task-1405358, 'name': RelocateVM_Task} progress is 40%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1184.502905] env[62277]: DEBUG oslo_vmware.api [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Task: {'id': task-1405358, 'name': RelocateVM_Task} progress is 54%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1184.683014] env[62277]: DEBUG nova.compute.manager [req-90b65ca0-585c-4acd-9b09-b446ace93b37 req-ed4d393d-65cc-495e-966f-0a6d55f5d683 service nova] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] Received event network-changed-09dcbb3a-90a6-4559-be66-b51fd18ff545 {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 1184.683393] env[62277]: DEBUG nova.compute.manager [req-90b65ca0-585c-4acd-9b09-b446ace93b37 req-ed4d393d-65cc-495e-966f-0a6d55f5d683 service nova] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] Refreshing instance network info cache due to event network-changed-09dcbb3a-90a6-4559-be66-b51fd18ff545. 
{{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11104}} [ 1184.683768] env[62277]: DEBUG oslo_concurrency.lockutils [req-90b65ca0-585c-4acd-9b09-b446ace93b37 req-ed4d393d-65cc-495e-966f-0a6d55f5d683 service nova] Acquiring lock "refresh_cache-154fa64c-55d4-4b72-8af9-39e72fd5df5f" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1184.683990] env[62277]: DEBUG oslo_concurrency.lockutils [req-90b65ca0-585c-4acd-9b09-b446ace93b37 req-ed4d393d-65cc-495e-966f-0a6d55f5d683 service nova] Acquired lock "refresh_cache-154fa64c-55d4-4b72-8af9-39e72fd5df5f" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1184.684255] env[62277]: DEBUG nova.network.neutron [req-90b65ca0-585c-4acd-9b09-b446ace93b37 req-ed4d393d-65cc-495e-966f-0a6d55f5d683 service nova] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] Refreshing network info cache for port 09dcbb3a-90a6-4559-be66-b51fd18ff545 {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1185.004361] env[62277]: DEBUG oslo_vmware.api [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Task: {'id': task-1405358, 'name': RelocateVM_Task} progress is 67%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1185.424908] env[62277]: DEBUG nova.network.neutron [req-90b65ca0-585c-4acd-9b09-b446ace93b37 req-ed4d393d-65cc-495e-966f-0a6d55f5d683 service nova] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] Updated VIF entry in instance network info cache for port 09dcbb3a-90a6-4559-be66-b51fd18ff545. {{(pid=62277) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1185.425345] env[62277]: DEBUG nova.network.neutron [req-90b65ca0-585c-4acd-9b09-b446ace93b37 req-ed4d393d-65cc-495e-966f-0a6d55f5d683 service nova] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] Updating instance_info_cache with network_info: [{"id": "09dcbb3a-90a6-4559-be66-b51fd18ff545", "address": "fa:16:3e:4c:96:ea", "network": {"id": "2aecd07a-f859-416d-ab4b-42b5ba108266", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-154506376-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9b5a89a9f42e47f18fe9ad00e539bdce", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dba18786-598d-4e06-96db-b3dc1717530f", "external-id": "nsx-vlan-transportzone-741", "segmentation_id": 741, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap09dcbb3a-90", "ovs_interfaceid": "09dcbb3a-90a6-4559-be66-b51fd18ff545", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1185.439096] env[62277]: DEBUG oslo_concurrency.lockutils [req-90b65ca0-585c-4acd-9b09-b446ace93b37 req-ed4d393d-65cc-495e-966f-0a6d55f5d683 service nova] Releasing 
lock "refresh_cache-154fa64c-55d4-4b72-8af9-39e72fd5df5f" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1185.506940] env[62277]: DEBUG oslo_vmware.api [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Task: {'id': task-1405358, 'name': RelocateVM_Task} progress is 82%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1186.005850] env[62277]: DEBUG oslo_vmware.api [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Task: {'id': task-1405358, 'name': RelocateVM_Task} progress is 97%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1186.506260] env[62277]: DEBUG oslo_vmware.api [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Task: {'id': task-1405358, 'name': RelocateVM_Task} progress is 98%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1187.006579] env[62277]: DEBUG oslo_vmware.api [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Task: {'id': task-1405358, 'name': RelocateVM_Task, 'duration_secs': 3.1664} completed successfully. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1187.006872] env[62277]: DEBUG nova.virt.vmwareapi.volumeops [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] Volume attach. 
Driver type: vmdk {{(pid=62277) attach_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:439}} [ 1187.007076] env[62277]: DEBUG nova.virt.vmwareapi.volumeops [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] _attach_volume_vmdk: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-297813', 'volume_id': '8127ec42-b0f5-4086-b761-68b728ea559b', 'name': 'volume-8127ec42-b0f5-4086-b761-68b728ea559b', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '154fa64c-55d4-4b72-8af9-39e72fd5df5f', 'attached_at': '', 'detached_at': '', 'volume_id': '8127ec42-b0f5-4086-b761-68b728ea559b', 'serial': '8127ec42-b0f5-4086-b761-68b728ea559b'} {{(pid=62277) _attach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:336}} [ 1187.007836] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-552afd39-92f3-4741-bc22-b4cd4895be5c {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1187.031129] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e0e6fdcd-d48a-43eb-9b78-e8ee473944db {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1187.055954] env[62277]: DEBUG nova.virt.vmwareapi.volumeops [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] Reconfiguring VM instance instance-00000010 to attach disk [datastore2] volume-8127ec42-b0f5-4086-b761-68b728ea559b/volume-8127ec42-b0f5-4086-b761-68b728ea559b.vmdk or device None with type thin {{(pid=62277) attach_disk_to_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:81}} [ 1187.055954] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-71287b34-b53a-4243-872a-dba428fd2ca7 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1187.080018] env[62277]: DEBUG oslo_vmware.api [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Waiting for the task: (returnval){ [ 1187.080018] env[62277]: value = "task-1405359" [ 1187.080018] env[62277]: _type = "Task" [ 1187.080018] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1187.092473] env[62277]: DEBUG oslo_vmware.api [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Task: {'id': task-1405359, 'name': ReconfigVM_Task} progress is 6%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1187.592116] env[62277]: DEBUG oslo_vmware.api [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Task: {'id': task-1405359, 'name': ReconfigVM_Task, 'duration_secs': 0.319278} completed successfully. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1187.592741] env[62277]: DEBUG nova.virt.vmwareapi.volumeops [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] Reconfigured VM instance instance-00000010 to attach disk [datastore2] volume-8127ec42-b0f5-4086-b761-68b728ea559b/volume-8127ec42-b0f5-4086-b761-68b728ea559b.vmdk or device None with type thin {{(pid=62277) attach_disk_to_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:88}} [ 1187.597345] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-b0832608-7c3b-4ec2-9696-08a13a1d5759 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1187.613408] env[62277]: DEBUG oslo_vmware.api [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Waiting for the task: (returnval){ [ 1187.613408] env[62277]: value = "task-1405360" [ 1187.613408] env[62277]: _type = "Task" [ 1187.613408] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1187.623085] env[62277]: DEBUG oslo_vmware.api [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Task: {'id': task-1405360, 'name': ReconfigVM_Task} progress is 6%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1188.123572] env[62277]: DEBUG oslo_vmware.api [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Task: {'id': task-1405360, 'name': ReconfigVM_Task, 'duration_secs': 0.117355} completed successfully. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1188.123932] env[62277]: DEBUG nova.virt.vmwareapi.volumeops [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] Attached VMDK: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-297813', 'volume_id': '8127ec42-b0f5-4086-b761-68b728ea559b', 'name': 'volume-8127ec42-b0f5-4086-b761-68b728ea559b', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '154fa64c-55d4-4b72-8af9-39e72fd5df5f', 'attached_at': '', 'detached_at': '', 'volume_id': '8127ec42-b0f5-4086-b761-68b728ea559b', 'serial': '8127ec42-b0f5-4086-b761-68b728ea559b'} {{(pid=62277) _attach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:361}} [ 1188.124553] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.Rename_Task with opID=oslo.vmware-3472bee6-cc68-444b-aec8-99d1d345fc20 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1188.131275] env[62277]: DEBUG oslo_vmware.api [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Waiting for the task: (returnval){ [ 1188.131275] env[62277]: value = "task-1405361" [ 1188.131275] env[62277]: _type = "Task" [ 1188.131275] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1188.139932] env[62277]: DEBUG oslo_vmware.api [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Task: {'id': task-1405361, 'name': Rename_Task} progress is 5%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1188.641766] env[62277]: DEBUG oslo_vmware.api [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Task: {'id': task-1405361, 'name': Rename_Task, 'duration_secs': 0.11869} completed successfully. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1188.642057] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] Powering on the VM {{(pid=62277) power_on_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1442}} [ 1188.642310] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.PowerOnVM_Task with opID=oslo.vmware-d3eeeadf-2d14-4698-b418-6a2ef883741e {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1188.649338] env[62277]: DEBUG oslo_vmware.api [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Waiting for the task: (returnval){ [ 1188.649338] env[62277]: value = "task-1405362" [ 1188.649338] env[62277]: _type = "Task" [ 1188.649338] env[62277]: } to complete. 
{{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1188.658109] env[62277]: DEBUG oslo_vmware.api [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Task: {'id': task-1405362, 'name': PowerOnVM_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1189.159729] env[62277]: DEBUG oslo_vmware.api [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Task: {'id': task-1405362, 'name': PowerOnVM_Task} progress is 66%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1189.659572] env[62277]: DEBUG oslo_vmware.api [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Task: {'id': task-1405362, 'name': PowerOnVM_Task, 'duration_secs': 0.986225} completed successfully. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1189.659835] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] Powered on the VM {{(pid=62277) power_on_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1448}} [ 1189.660145] env[62277]: INFO nova.compute.manager [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] Took 8.20 seconds to spawn the instance on the hypervisor. [ 1189.660454] env[62277]: DEBUG nova.compute.manager [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] Checking state {{(pid=62277) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} [ 1189.661266] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6007f351-c89c-4810-a0a5-539938acb12f {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1189.722874] env[62277]: INFO nova.compute.manager [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] Took 9.18 seconds to build instance. 
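[editor's note] The entries above trace the boot-from-volume spawn path on the vmwareapi driver: CreateVM_Task builds the shell VM, RelocateVM_Task moves the Cinder volume's backing disk onto the target datastore, two ReconfigVM_Task calls attach the VMDK and update connection metadata, and Rename_Task plus PowerOnVM_Task finish the spawn, each awaited through oslo_vmware's task polling (wait_for_task / _poll_task in the log). The Python sketch below is only a simplified illustration of that poll-until-done pattern; the helper names (get_task_info, poll_interval) and the loop body are assumptions made for clarity, not the actual oslo.vmware or Nova implementation.

    # Schematic sketch (assumption): awaiting a vSphere task such as
    # CreateVM_Task or RelocateVM_Task by polling its TaskInfo until it
    # reports success or error. get_task_info is a hypothetical accessor
    # supplied by the caller; it is not an oslo.vmware API.
    import time

    class TaskFailed(Exception):
        """Raised when the vSphere task ends in the 'error' state."""

    def wait_for_task(get_task_info, task_ref, poll_interval=0.5):
        """Poll a vSphere task reference until it completes.

        get_task_info(task_ref) is assumed to return an object with
        .state ('queued'/'running'/'success'/'error'), .progress, and .error.
        """
        while True:
            info = get_task_info(task_ref)
            if info.state == "success":
                # e.g. CreateVM_Task finished in ~0.32s, RelocateVM_Task in ~3.2s above
                return info
            if info.state == "error":
                raise TaskFailed(str(info.error))
            # still 'queued' or 'running': progress shows as 0%, 40%, 97%, ... in the log
            time.sleep(poll_interval)

[end editor's note]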
[ 1189.742315] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b7e700fc-70b6-456e-9a1d-7843e0910d45 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Lock "154fa64c-55d4-4b72-8af9-39e72fd5df5f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 166.715s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1189.758020] env[62277]: DEBUG nova.compute.manager [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1189.817024] env[62277]: DEBUG oslo_concurrency.lockutils [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1189.817024] env[62277]: DEBUG oslo_concurrency.lockutils [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1189.817024] env[62277]: INFO nova.compute.claims [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1190.205897] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e662c283-a3ea-47f6-8713-3345115474e1 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1190.213520] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-84cea11b-0441-420c-a81e-8d55a91426ff {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1190.243546] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-24d8fd48-dbd8-4ce0-b462-173466024861 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1190.251019] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c46876c-3e3e-4b7f-a09a-fd18ee14ed74 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1190.264032] env[62277]: DEBUG nova.compute.provider_tree [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1190.270409] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa 
tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Acquiring lock "6d759045-e1fc-43ea-a882-1ead769b6d29" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1190.271272] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Lock "6d759045-e1fc-43ea-a882-1ead769b6d29" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1190.272427] env[62277]: DEBUG nova.scheduler.client.report [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1190.287688] env[62277]: DEBUG oslo_concurrency.lockutils [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.472s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1190.288211] env[62277]: DEBUG nova.compute.manager [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Start building networks asynchronously for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1190.330550] env[62277]: DEBUG nova.compute.utils [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Using /dev/sd instead of None {{(pid=62277) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1190.331822] env[62277]: DEBUG nova.compute.manager [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Allocating IP information in the background. 
{{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1190.332019] env[62277]: DEBUG nova.network.neutron [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] allocate_for_instance() {{(pid=62277) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1190.341900] env[62277]: DEBUG nova.compute.manager [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Start building block device mappings for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1190.416979] env[62277]: DEBUG nova.compute.manager [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Start spawning the instance on the hypervisor. {{(pid=62277) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1190.431998] env[62277]: DEBUG nova.policy [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '69537d4d5bdc4140941598f27c8ac31a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd0d30b51245940e8a538b261979633b2', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62277) authorize /opt/stack/nova/nova/policy.py:203}} [ 1190.441805] env[62277]: DEBUG nova.virt.hardware [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-11-06T22:11:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-11-06T22:11:15Z,direct_url=,disk_format='vmdk',id=6f125163-af69-40e9-92ae-3b8a01d74b60,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='63ed6630e9c140baa826f53d7a0564d1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-11-06T22:11:16Z,virtual_size=,visibility=), allow threads: False {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1190.442044] env[62277]: DEBUG nova.virt.hardware [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Flavor limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1190.442205] env[62277]: DEBUG nova.virt.hardware [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Image limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:352}} [ 1190.442472] env[62277]: DEBUG nova.virt.hardware [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Flavor pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1190.442519] env[62277]: DEBUG nova.virt.hardware [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Image pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1190.442662] env[62277]: DEBUG nova.virt.hardware [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1190.442842] env[62277]: DEBUG nova.virt.hardware [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1190.442996] env[62277]: DEBUG nova.virt.hardware [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1190.443269] env[62277]: DEBUG nova.virt.hardware [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Got 1 possible topologies {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1190.443440] env[62277]: DEBUG nova.virt.hardware [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1190.443610] env[62277]: DEBUG nova.virt.hardware [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1190.444461] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f159083f-fd0a-43ec-b6b0-286f8df89296 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1190.453120] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4563ccdb-caff-42ed-9579-f1115a1138c3 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1191.028025] env[62277]: DEBUG nova.network.neutron [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Successfully created port: 
9c8ca84a-b847-459d-9f41-aa4c968f1222 {{(pid=62277) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1191.482237] env[62277]: DEBUG nova.network.neutron [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Successfully created port: ce3cacf5-c77f-46ad-9c46-843407134626 {{(pid=62277) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1191.902478] env[62277]: DEBUG nova.network.neutron [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Successfully created port: 78db1554-8dd0-4284-a010-10c296868f4e {{(pid=62277) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1192.159242] env[62277]: DEBUG nova.compute.manager [req-67d91ea6-d2d5-4c78-acbe-c2500669f0f4 req-dd95954c-c973-4663-9f6d-7dd3628cf464 service nova] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] Received event network-changed-09dcbb3a-90a6-4559-be66-b51fd18ff545 {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 1192.159490] env[62277]: DEBUG nova.compute.manager [req-67d91ea6-d2d5-4c78-acbe-c2500669f0f4 req-dd95954c-c973-4663-9f6d-7dd3628cf464 service nova] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] Refreshing instance network info cache due to event network-changed-09dcbb3a-90a6-4559-be66-b51fd18ff545. {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11104}} [ 1192.159773] env[62277]: DEBUG oslo_concurrency.lockutils [req-67d91ea6-d2d5-4c78-acbe-c2500669f0f4 req-dd95954c-c973-4663-9f6d-7dd3628cf464 service nova] Acquiring lock "refresh_cache-154fa64c-55d4-4b72-8af9-39e72fd5df5f" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1192.159957] env[62277]: DEBUG oslo_concurrency.lockutils [req-67d91ea6-d2d5-4c78-acbe-c2500669f0f4 req-dd95954c-c973-4663-9f6d-7dd3628cf464 service nova] Acquired lock "refresh_cache-154fa64c-55d4-4b72-8af9-39e72fd5df5f" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1192.160183] env[62277]: DEBUG nova.network.neutron [req-67d91ea6-d2d5-4c78-acbe-c2500669f0f4 req-dd95954c-c973-4663-9f6d-7dd3628cf464 service nova] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] Refreshing network info cache for port 09dcbb3a-90a6-4559-be66-b51fd18ff545 {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1192.466009] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b6b53b76-79f6-467b-a6d4-2655043bf5de tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Acquiring lock "19d15611-315f-4c4b-8f32-e5d00d0d8ca8" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1192.771883] env[62277]: DEBUG nova.network.neutron [req-67d91ea6-d2d5-4c78-acbe-c2500669f0f4 req-dd95954c-c973-4663-9f6d-7dd3628cf464 service nova] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] Updated VIF entry in instance network info cache for port 09dcbb3a-90a6-4559-be66-b51fd18ff545. 
{{(pid=62277) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1192.772149] env[62277]: DEBUG nova.network.neutron [req-67d91ea6-d2d5-4c78-acbe-c2500669f0f4 req-dd95954c-c973-4663-9f6d-7dd3628cf464 service nova] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] Updating instance_info_cache with network_info: [{"id": "09dcbb3a-90a6-4559-be66-b51fd18ff545", "address": "fa:16:3e:4c:96:ea", "network": {"id": "2aecd07a-f859-416d-ab4b-42b5ba108266", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-154506376-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "10.180.180.198", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9b5a89a9f42e47f18fe9ad00e539bdce", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dba18786-598d-4e06-96db-b3dc1717530f", "external-id": "nsx-vlan-transportzone-741", "segmentation_id": 741, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap09dcbb3a-90", "ovs_interfaceid": "09dcbb3a-90a6-4559-be66-b51fd18ff545", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1192.785694] env[62277]: DEBUG oslo_concurrency.lockutils [req-67d91ea6-d2d5-4c78-acbe-c2500669f0f4 req-dd95954c-c973-4663-9f6d-7dd3628cf464 service nova] Releasing lock "refresh_cache-154fa64c-55d4-4b72-8af9-39e72fd5df5f" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1192.823663] env[62277]: DEBUG nova.network.neutron [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Successfully updated port: 9c8ca84a-b847-459d-9f41-aa4c968f1222 {{(pid=62277) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1193.058313] env[62277]: DEBUG nova.compute.manager [req-18aed44a-a880-43ae-95da-995bf565d1d4 req-a9ac8664-a0ce-4df8-8acd-42b46b9f0087 service nova] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Received event network-vif-plugged-9c8ca84a-b847-459d-9f41-aa4c968f1222 {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 1193.058588] env[62277]: DEBUG oslo_concurrency.lockutils [req-18aed44a-a880-43ae-95da-995bf565d1d4 req-a9ac8664-a0ce-4df8-8acd-42b46b9f0087 service nova] Acquiring lock "3b9b51fe-d558-40ef-b22a-2b8958b3e1a0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1193.058967] env[62277]: DEBUG oslo_concurrency.lockutils [req-18aed44a-a880-43ae-95da-995bf565d1d4 req-a9ac8664-a0ce-4df8-8acd-42b46b9f0087 service nova] Lock "3b9b51fe-d558-40ef-b22a-2b8958b3e1a0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=62277) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1193.059233] env[62277]: DEBUG oslo_concurrency.lockutils [req-18aed44a-a880-43ae-95da-995bf565d1d4 req-a9ac8664-a0ce-4df8-8acd-42b46b9f0087 service nova] Lock "3b9b51fe-d558-40ef-b22a-2b8958b3e1a0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1193.059366] env[62277]: DEBUG nova.compute.manager [req-18aed44a-a880-43ae-95da-995bf565d1d4 req-a9ac8664-a0ce-4df8-8acd-42b46b9f0087 service nova] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] No waiting events found dispatching network-vif-plugged-9c8ca84a-b847-459d-9f41-aa4c968f1222 {{(pid=62277) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1193.059579] env[62277]: WARNING nova.compute.manager [req-18aed44a-a880-43ae-95da-995bf565d1d4 req-a9ac8664-a0ce-4df8-8acd-42b46b9f0087 service nova] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Received unexpected event network-vif-plugged-9c8ca84a-b847-459d-9f41-aa4c968f1222 for instance with vm_state building and task_state spawning. [ 1193.627501] env[62277]: DEBUG nova.network.neutron [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Successfully updated port: ce3cacf5-c77f-46ad-9c46-843407134626 {{(pid=62277) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1194.720557] env[62277]: DEBUG nova.network.neutron [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Successfully updated port: 78db1554-8dd0-4284-a010-10c296868f4e {{(pid=62277) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1194.734346] env[62277]: DEBUG oslo_concurrency.lockutils [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Acquiring lock "refresh_cache-3b9b51fe-d558-40ef-b22a-2b8958b3e1a0" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1194.734493] env[62277]: DEBUG oslo_concurrency.lockutils [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Acquired lock "refresh_cache-3b9b51fe-d558-40ef-b22a-2b8958b3e1a0" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1194.734644] env[62277]: DEBUG nova.network.neutron [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Building network info cache for instance {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1194.773165] env[62277]: DEBUG nova.network.neutron [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Instance cache missing network info. 
{{(pid=62277) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1195.181572] env[62277]: DEBUG nova.compute.manager [req-0d1fa9e3-afe7-4dc1-9da2-20170bd25132 req-7ecee854-557d-4185-ae69-0e5ae04dc93a service nova] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Received event network-changed-9c8ca84a-b847-459d-9f41-aa4c968f1222 {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 1195.181768] env[62277]: DEBUG nova.compute.manager [req-0d1fa9e3-afe7-4dc1-9da2-20170bd25132 req-7ecee854-557d-4185-ae69-0e5ae04dc93a service nova] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Refreshing instance network info cache due to event network-changed-9c8ca84a-b847-459d-9f41-aa4c968f1222. {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11104}} [ 1195.181952] env[62277]: DEBUG oslo_concurrency.lockutils [req-0d1fa9e3-afe7-4dc1-9da2-20170bd25132 req-7ecee854-557d-4185-ae69-0e5ae04dc93a service nova] Acquiring lock "refresh_cache-3b9b51fe-d558-40ef-b22a-2b8958b3e1a0" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1195.395949] env[62277]: DEBUG oslo_concurrency.lockutils [None req-f6de3ad1-5655-4014-bb0c-f1bdc9de330e tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Acquiring lock "dfc291fd-1481-4e76-9fb3-ec87124c1281" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1195.439844] env[62277]: DEBUG nova.network.neutron [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Updating instance_info_cache with network_info: [{"id": "9c8ca84a-b847-459d-9f41-aa4c968f1222", "address": "fa:16:3e:af:aa:86", "network": {"id": "40af90f0-0e9a-4d37-8f0f-f65081a64caa", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1394444417", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.238", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "d0d30b51245940e8a538b261979633b2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "55bd18a7-39a8-4d07-9088-9b944f9ff710", "external-id": "nsx-vlan-transportzone-686", "segmentation_id": 686, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9c8ca84a-b8", "ovs_interfaceid": "9c8ca84a-b847-459d-9f41-aa4c968f1222", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ce3cacf5-c77f-46ad-9c46-843407134626", "address": "fa:16:3e:98:8f:33", "network": {"id": "816d63ff-718c-4730-b7a6-c4e9fffe2346", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-120807393", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.176", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "d0d30b51245940e8a538b261979633b2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "46785c9c-8b22-487d-a854-b3e67c5ed1d7", "external-id": "nsx-vlan-transportzone-430", "segmentation_id": 430, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapce3cacf5-c7", "ovs_interfaceid": "ce3cacf5-c77f-46ad-9c46-843407134626", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "78db1554-8dd0-4284-a010-10c296868f4e", "address": "fa:16:3e:e8:14:85", "network": {"id": "40af90f0-0e9a-4d37-8f0f-f65081a64caa", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1394444417", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.144", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "d0d30b51245940e8a538b261979633b2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "55bd18a7-39a8-4d07-9088-9b944f9ff710", "external-id": "nsx-vlan-transportzone-686", "segmentation_id": 686, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap78db1554-8d", "ovs_interfaceid": "78db1554-8dd0-4284-a010-10c296868f4e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1195.456543] env[62277]: DEBUG oslo_concurrency.lockutils [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Releasing lock "refresh_cache-3b9b51fe-d558-40ef-b22a-2b8958b3e1a0" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1195.456904] env[62277]: DEBUG nova.compute.manager [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Instance network_info: |[{"id": "9c8ca84a-b847-459d-9f41-aa4c968f1222", "address": "fa:16:3e:af:aa:86", "network": {"id": "40af90f0-0e9a-4d37-8f0f-f65081a64caa", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1394444417", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.238", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "d0d30b51245940e8a538b261979633b2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "55bd18a7-39a8-4d07-9088-9b944f9ff710", "external-id": "nsx-vlan-transportzone-686", "segmentation_id": 686, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9c8ca84a-b8", 
"ovs_interfaceid": "9c8ca84a-b847-459d-9f41-aa4c968f1222", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ce3cacf5-c77f-46ad-9c46-843407134626", "address": "fa:16:3e:98:8f:33", "network": {"id": "816d63ff-718c-4730-b7a6-c4e9fffe2346", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-120807393", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.176", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "d0d30b51245940e8a538b261979633b2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "46785c9c-8b22-487d-a854-b3e67c5ed1d7", "external-id": "nsx-vlan-transportzone-430", "segmentation_id": 430, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapce3cacf5-c7", "ovs_interfaceid": "ce3cacf5-c77f-46ad-9c46-843407134626", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "78db1554-8dd0-4284-a010-10c296868f4e", "address": "fa:16:3e:e8:14:85", "network": {"id": "40af90f0-0e9a-4d37-8f0f-f65081a64caa", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1394444417", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.144", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "d0d30b51245940e8a538b261979633b2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "55bd18a7-39a8-4d07-9088-9b944f9ff710", "external-id": "nsx-vlan-transportzone-686", "segmentation_id": 686, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap78db1554-8d", "ovs_interfaceid": "78db1554-8dd0-4284-a010-10c296868f4e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1195.457231] env[62277]: DEBUG oslo_concurrency.lockutils [req-0d1fa9e3-afe7-4dc1-9da2-20170bd25132 req-7ecee854-557d-4185-ae69-0e5ae04dc93a service nova] Acquired lock "refresh_cache-3b9b51fe-d558-40ef-b22a-2b8958b3e1a0" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1195.457409] env[62277]: DEBUG nova.network.neutron [req-0d1fa9e3-afe7-4dc1-9da2-20170bd25132 req-7ecee854-557d-4185-ae69-0e5ae04dc93a service nova] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Refreshing network info cache for port 9c8ca84a-b847-459d-9f41-aa4c968f1222 {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1195.458453] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] [instance: 
3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:af:aa:86', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '55bd18a7-39a8-4d07-9088-9b944f9ff710', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '9c8ca84a-b847-459d-9f41-aa4c968f1222', 'vif_model': 'vmxnet3'}, {'network_name': 'br-int', 'mac_address': 'fa:16:3e:98:8f:33', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '46785c9c-8b22-487d-a854-b3e67c5ed1d7', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'ce3cacf5-c77f-46ad-9c46-843407134626', 'vif_model': 'vmxnet3'}, {'network_name': 'br-int', 'mac_address': 'fa:16:3e:e8:14:85', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '55bd18a7-39a8-4d07-9088-9b944f9ff710', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '78db1554-8dd0-4284-a010-10c296868f4e', 'vif_model': 'vmxnet3'}] {{(pid=62277) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1195.469651] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Creating folder: Project (d0d30b51245940e8a538b261979633b2). Parent ref: group-v297781. {{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1195.470614] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-ea9e8ac6-cc58-409b-917f-4a463f348de4 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1195.484268] env[62277]: INFO nova.virt.vmwareapi.vm_util [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Created folder: Project (d0d30b51245940e8a538b261979633b2) in parent group-v297781. [ 1195.485018] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Creating folder: Instances. Parent ref: group-v297831. {{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1195.485018] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-0455c019-e961-47bd-937d-ad2248feb058 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1195.493610] env[62277]: INFO nova.virt.vmwareapi.vm_util [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Created folder: Instances in parent group-v297831. [ 1195.493833] env[62277]: DEBUG oslo.service.loopingcall [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1195.494027] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Creating VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1195.494262] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-2bd78970-a11f-4973-b4e1-7e452a22d552 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1195.518374] env[62277]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1195.518374] env[62277]: value = "task-1405365" [ 1195.518374] env[62277]: _type = "Task" [ 1195.518374] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1195.528009] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405365, 'name': CreateVM_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1195.822956] env[62277]: DEBUG nova.network.neutron [req-0d1fa9e3-afe7-4dc1-9da2-20170bd25132 req-7ecee854-557d-4185-ae69-0e5ae04dc93a service nova] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Updated VIF entry in instance network info cache for port 9c8ca84a-b847-459d-9f41-aa4c968f1222. {{(pid=62277) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1195.823524] env[62277]: DEBUG nova.network.neutron [req-0d1fa9e3-afe7-4dc1-9da2-20170bd25132 req-7ecee854-557d-4185-ae69-0e5ae04dc93a service nova] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Updating instance_info_cache with network_info: [{"id": "9c8ca84a-b847-459d-9f41-aa4c968f1222", "address": "fa:16:3e:af:aa:86", "network": {"id": "40af90f0-0e9a-4d37-8f0f-f65081a64caa", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1394444417", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.238", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "d0d30b51245940e8a538b261979633b2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "55bd18a7-39a8-4d07-9088-9b944f9ff710", "external-id": "nsx-vlan-transportzone-686", "segmentation_id": 686, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9c8ca84a-b8", "ovs_interfaceid": "9c8ca84a-b847-459d-9f41-aa4c968f1222", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ce3cacf5-c77f-46ad-9c46-843407134626", "address": "fa:16:3e:98:8f:33", "network": {"id": "816d63ff-718c-4730-b7a6-c4e9fffe2346", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-120807393", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.176", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "d0d30b51245940e8a538b261979633b2", "mtu": 
8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "46785c9c-8b22-487d-a854-b3e67c5ed1d7", "external-id": "nsx-vlan-transportzone-430", "segmentation_id": 430, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapce3cacf5-c7", "ovs_interfaceid": "ce3cacf5-c77f-46ad-9c46-843407134626", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "78db1554-8dd0-4284-a010-10c296868f4e", "address": "fa:16:3e:e8:14:85", "network": {"id": "40af90f0-0e9a-4d37-8f0f-f65081a64caa", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1394444417", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.144", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "d0d30b51245940e8a538b261979633b2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "55bd18a7-39a8-4d07-9088-9b944f9ff710", "external-id": "nsx-vlan-transportzone-686", "segmentation_id": 686, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap78db1554-8d", "ovs_interfaceid": "78db1554-8dd0-4284-a010-10c296868f4e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1195.834753] env[62277]: DEBUG oslo_concurrency.lockutils [req-0d1fa9e3-afe7-4dc1-9da2-20170bd25132 req-7ecee854-557d-4185-ae69-0e5ae04dc93a service nova] Releasing lock "refresh_cache-3b9b51fe-d558-40ef-b22a-2b8958b3e1a0" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1195.834980] env[62277]: DEBUG nova.compute.manager [req-0d1fa9e3-afe7-4dc1-9da2-20170bd25132 req-7ecee854-557d-4185-ae69-0e5ae04dc93a service nova] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Received event network-vif-plugged-ce3cacf5-c77f-46ad-9c46-843407134626 {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 1195.835192] env[62277]: DEBUG oslo_concurrency.lockutils [req-0d1fa9e3-afe7-4dc1-9da2-20170bd25132 req-7ecee854-557d-4185-ae69-0e5ae04dc93a service nova] Acquiring lock "3b9b51fe-d558-40ef-b22a-2b8958b3e1a0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1195.835394] env[62277]: DEBUG oslo_concurrency.lockutils [req-0d1fa9e3-afe7-4dc1-9da2-20170bd25132 req-7ecee854-557d-4185-ae69-0e5ae04dc93a service nova] Lock "3b9b51fe-d558-40ef-b22a-2b8958b3e1a0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1195.835578] env[62277]: DEBUG oslo_concurrency.lockutils [req-0d1fa9e3-afe7-4dc1-9da2-20170bd25132 req-7ecee854-557d-4185-ae69-0e5ae04dc93a service nova] Lock "3b9b51fe-d558-40ef-b22a-2b8958b3e1a0-events" "released" by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1195.835746] env[62277]: DEBUG nova.compute.manager [req-0d1fa9e3-afe7-4dc1-9da2-20170bd25132 req-7ecee854-557d-4185-ae69-0e5ae04dc93a service nova] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] No waiting events found dispatching network-vif-plugged-ce3cacf5-c77f-46ad-9c46-843407134626 {{(pid=62277) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1195.835909] env[62277]: WARNING nova.compute.manager [req-0d1fa9e3-afe7-4dc1-9da2-20170bd25132 req-7ecee854-557d-4185-ae69-0e5ae04dc93a service nova] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Received unexpected event network-vif-plugged-ce3cacf5-c77f-46ad-9c46-843407134626 for instance with vm_state building and task_state spawning. [ 1195.836079] env[62277]: DEBUG nova.compute.manager [req-0d1fa9e3-afe7-4dc1-9da2-20170bd25132 req-7ecee854-557d-4185-ae69-0e5ae04dc93a service nova] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Received event network-changed-ce3cacf5-c77f-46ad-9c46-843407134626 {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 1195.836238] env[62277]: DEBUG nova.compute.manager [req-0d1fa9e3-afe7-4dc1-9da2-20170bd25132 req-7ecee854-557d-4185-ae69-0e5ae04dc93a service nova] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Refreshing instance network info cache due to event network-changed-ce3cacf5-c77f-46ad-9c46-843407134626. {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11104}} [ 1195.836419] env[62277]: DEBUG oslo_concurrency.lockutils [req-0d1fa9e3-afe7-4dc1-9da2-20170bd25132 req-7ecee854-557d-4185-ae69-0e5ae04dc93a service nova] Acquiring lock "refresh_cache-3b9b51fe-d558-40ef-b22a-2b8958b3e1a0" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1195.836552] env[62277]: DEBUG oslo_concurrency.lockutils [req-0d1fa9e3-afe7-4dc1-9da2-20170bd25132 req-7ecee854-557d-4185-ae69-0e5ae04dc93a service nova] Acquired lock "refresh_cache-3b9b51fe-d558-40ef-b22a-2b8958b3e1a0" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1195.836701] env[62277]: DEBUG nova.network.neutron [req-0d1fa9e3-afe7-4dc1-9da2-20170bd25132 req-7ecee854-557d-4185-ae69-0e5ae04dc93a service nova] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Refreshing network info cache for port ce3cacf5-c77f-46ad-9c46-843407134626 {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1196.031829] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405365, 'name': CreateVM_Task, 'duration_secs': 0.364863} completed successfully. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1196.032007] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Created VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1196.032841] env[62277]: DEBUG oslo_concurrency.lockutils [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1196.033007] env[62277]: DEBUG oslo_concurrency.lockutils [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1196.033408] env[62277]: DEBUG oslo_concurrency.lockutils [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1196.033658] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-82a69811-235b-47cd-86f0-63d76c98827c {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1196.037965] env[62277]: DEBUG oslo_vmware.api [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Waiting for the task: (returnval){ [ 1196.037965] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]526a0da7-a60b-3fa9-0723-c06c625f2d94" [ 1196.037965] env[62277]: _type = "Task" [ 1196.037965] env[62277]: } to complete. 
{{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1196.052305] env[62277]: DEBUG oslo_concurrency.lockutils [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1196.052538] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Processing image 6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1196.052741] env[62277]: DEBUG oslo_concurrency.lockutils [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1196.159798] env[62277]: DEBUG nova.network.neutron [req-0d1fa9e3-afe7-4dc1-9da2-20170bd25132 req-7ecee854-557d-4185-ae69-0e5ae04dc93a service nova] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Updated VIF entry in instance network info cache for port ce3cacf5-c77f-46ad-9c46-843407134626. {{(pid=62277) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1196.160341] env[62277]: DEBUG nova.network.neutron [req-0d1fa9e3-afe7-4dc1-9da2-20170bd25132 req-7ecee854-557d-4185-ae69-0e5ae04dc93a service nova] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Updating instance_info_cache with network_info: [{"id": "9c8ca84a-b847-459d-9f41-aa4c968f1222", "address": "fa:16:3e:af:aa:86", "network": {"id": "40af90f0-0e9a-4d37-8f0f-f65081a64caa", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1394444417", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.238", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "d0d30b51245940e8a538b261979633b2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "55bd18a7-39a8-4d07-9088-9b944f9ff710", "external-id": "nsx-vlan-transportzone-686", "segmentation_id": 686, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9c8ca84a-b8", "ovs_interfaceid": "9c8ca84a-b847-459d-9f41-aa4c968f1222", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ce3cacf5-c77f-46ad-9c46-843407134626", "address": "fa:16:3e:98:8f:33", "network": {"id": "816d63ff-718c-4730-b7a6-c4e9fffe2346", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-120807393", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.176", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "d0d30b51245940e8a538b261979633b2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "46785c9c-8b22-487d-a854-b3e67c5ed1d7", "external-id": "nsx-vlan-transportzone-430", "segmentation_id": 430, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapce3cacf5-c7", "ovs_interfaceid": "ce3cacf5-c77f-46ad-9c46-843407134626", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "78db1554-8dd0-4284-a010-10c296868f4e", "address": "fa:16:3e:e8:14:85", "network": {"id": "40af90f0-0e9a-4d37-8f0f-f65081a64caa", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1394444417", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.144", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "d0d30b51245940e8a538b261979633b2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "55bd18a7-39a8-4d07-9088-9b944f9ff710", "external-id": "nsx-vlan-transportzone-686", "segmentation_id": 686, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap78db1554-8d", "ovs_interfaceid": "78db1554-8dd0-4284-a010-10c296868f4e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1196.171743] env[62277]: DEBUG oslo_concurrency.lockutils [req-0d1fa9e3-afe7-4dc1-9da2-20170bd25132 req-7ecee854-557d-4185-ae69-0e5ae04dc93a service nova] Releasing lock "refresh_cache-3b9b51fe-d558-40ef-b22a-2b8958b3e1a0" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1196.172021] env[62277]: DEBUG nova.compute.manager [req-0d1fa9e3-afe7-4dc1-9da2-20170bd25132 req-7ecee854-557d-4185-ae69-0e5ae04dc93a service nova] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Received event network-vif-plugged-78db1554-8dd0-4284-a010-10c296868f4e {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 1196.172288] env[62277]: DEBUG oslo_concurrency.lockutils [req-0d1fa9e3-afe7-4dc1-9da2-20170bd25132 req-7ecee854-557d-4185-ae69-0e5ae04dc93a service nova] Acquiring lock "3b9b51fe-d558-40ef-b22a-2b8958b3e1a0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1196.172564] env[62277]: DEBUG oslo_concurrency.lockutils [req-0d1fa9e3-afe7-4dc1-9da2-20170bd25132 req-7ecee854-557d-4185-ae69-0e5ae04dc93a service nova] Lock "3b9b51fe-d558-40ef-b22a-2b8958b3e1a0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1196.172647] env[62277]: DEBUG 
oslo_concurrency.lockutils [req-0d1fa9e3-afe7-4dc1-9da2-20170bd25132 req-7ecee854-557d-4185-ae69-0e5ae04dc93a service nova] Lock "3b9b51fe-d558-40ef-b22a-2b8958b3e1a0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1196.172736] env[62277]: DEBUG nova.compute.manager [req-0d1fa9e3-afe7-4dc1-9da2-20170bd25132 req-7ecee854-557d-4185-ae69-0e5ae04dc93a service nova] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] No waiting events found dispatching network-vif-plugged-78db1554-8dd0-4284-a010-10c296868f4e {{(pid=62277) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1196.172898] env[62277]: WARNING nova.compute.manager [req-0d1fa9e3-afe7-4dc1-9da2-20170bd25132 req-7ecee854-557d-4185-ae69-0e5ae04dc93a service nova] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Received unexpected event network-vif-plugged-78db1554-8dd0-4284-a010-10c296868f4e for instance with vm_state building and task_state spawning. [ 1196.173112] env[62277]: DEBUG nova.compute.manager [req-0d1fa9e3-afe7-4dc1-9da2-20170bd25132 req-7ecee854-557d-4185-ae69-0e5ae04dc93a service nova] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Received event network-changed-78db1554-8dd0-4284-a010-10c296868f4e {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 1196.173305] env[62277]: DEBUG nova.compute.manager [req-0d1fa9e3-afe7-4dc1-9da2-20170bd25132 req-7ecee854-557d-4185-ae69-0e5ae04dc93a service nova] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Refreshing instance network info cache due to event network-changed-78db1554-8dd0-4284-a010-10c296868f4e. {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11104}} [ 1196.173517] env[62277]: DEBUG oslo_concurrency.lockutils [req-0d1fa9e3-afe7-4dc1-9da2-20170bd25132 req-7ecee854-557d-4185-ae69-0e5ae04dc93a service nova] Acquiring lock "refresh_cache-3b9b51fe-d558-40ef-b22a-2b8958b3e1a0" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1196.173657] env[62277]: DEBUG oslo_concurrency.lockutils [req-0d1fa9e3-afe7-4dc1-9da2-20170bd25132 req-7ecee854-557d-4185-ae69-0e5ae04dc93a service nova] Acquired lock "refresh_cache-3b9b51fe-d558-40ef-b22a-2b8958b3e1a0" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1196.173810] env[62277]: DEBUG nova.network.neutron [req-0d1fa9e3-afe7-4dc1-9da2-20170bd25132 req-7ecee854-557d-4185-ae69-0e5ae04dc93a service nova] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Refreshing network info cache for port 78db1554-8dd0-4284-a010-10c296868f4e {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1196.512137] env[62277]: DEBUG nova.network.neutron [req-0d1fa9e3-afe7-4dc1-9da2-20170bd25132 req-7ecee854-557d-4185-ae69-0e5ae04dc93a service nova] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Updated VIF entry in instance network info cache for port 78db1554-8dd0-4284-a010-10c296868f4e. 
{{(pid=62277) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1196.512137] env[62277]: DEBUG nova.network.neutron [req-0d1fa9e3-afe7-4dc1-9da2-20170bd25132 req-7ecee854-557d-4185-ae69-0e5ae04dc93a service nova] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Updating instance_info_cache with network_info: [{"id": "9c8ca84a-b847-459d-9f41-aa4c968f1222", "address": "fa:16:3e:af:aa:86", "network": {"id": "40af90f0-0e9a-4d37-8f0f-f65081a64caa", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1394444417", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.238", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "d0d30b51245940e8a538b261979633b2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "55bd18a7-39a8-4d07-9088-9b944f9ff710", "external-id": "nsx-vlan-transportzone-686", "segmentation_id": 686, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9c8ca84a-b8", "ovs_interfaceid": "9c8ca84a-b847-459d-9f41-aa4c968f1222", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ce3cacf5-c77f-46ad-9c46-843407134626", "address": "fa:16:3e:98:8f:33", "network": {"id": "816d63ff-718c-4730-b7a6-c4e9fffe2346", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-120807393", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.176", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "d0d30b51245940e8a538b261979633b2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "46785c9c-8b22-487d-a854-b3e67c5ed1d7", "external-id": "nsx-vlan-transportzone-430", "segmentation_id": 430, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapce3cacf5-c7", "ovs_interfaceid": "ce3cacf5-c77f-46ad-9c46-843407134626", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "78db1554-8dd0-4284-a010-10c296868f4e", "address": "fa:16:3e:e8:14:85", "network": {"id": "40af90f0-0e9a-4d37-8f0f-f65081a64caa", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1394444417", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.144", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "d0d30b51245940e8a538b261979633b2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "55bd18a7-39a8-4d07-9088-9b944f9ff710", "external-id": "nsx-vlan-transportzone-686", 
"segmentation_id": 686, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap78db1554-8d", "ovs_interfaceid": "78db1554-8dd0-4284-a010-10c296868f4e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1196.521144] env[62277]: DEBUG oslo_concurrency.lockutils [req-0d1fa9e3-afe7-4dc1-9da2-20170bd25132 req-7ecee854-557d-4185-ae69-0e5ae04dc93a service nova] Releasing lock "refresh_cache-3b9b51fe-d558-40ef-b22a-2b8958b3e1a0" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1201.311474] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e6449e7e-5171-4d0c-87cc-f1356b66e477 tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Acquiring lock "68925f1b-da69-4955-acb1-d6500b03daee" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1208.475289] env[62277]: DEBUG oslo_concurrency.lockutils [None req-94ebb12a-e413-4ba8-b304-6c4ff8e832a5 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Acquiring lock "866c4415-caab-4d81-86ba-ed662feb3c4f" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1208.598589] env[62277]: DEBUG oslo_concurrency.lockutils [None req-cda1dfcf-8c90-4c50-aa5f-bc40dc9d6400 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Acquiring lock "154fa64c-55d4-4b72-8af9-39e72fd5df5f" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1208.598819] env[62277]: DEBUG oslo_concurrency.lockutils [None req-cda1dfcf-8c90-4c50-aa5f-bc40dc9d6400 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Lock "154fa64c-55d4-4b72-8af9-39e72fd5df5f" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1208.599027] env[62277]: DEBUG oslo_concurrency.lockutils [None req-cda1dfcf-8c90-4c50-aa5f-bc40dc9d6400 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Acquiring lock "154fa64c-55d4-4b72-8af9-39e72fd5df5f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1208.600023] env[62277]: DEBUG oslo_concurrency.lockutils [None req-cda1dfcf-8c90-4c50-aa5f-bc40dc9d6400 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Lock "154fa64c-55d4-4b72-8af9-39e72fd5df5f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1208.600023] env[62277]: DEBUG 
oslo_concurrency.lockutils [None req-cda1dfcf-8c90-4c50-aa5f-bc40dc9d6400 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Lock "154fa64c-55d4-4b72-8af9-39e72fd5df5f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1208.601765] env[62277]: INFO nova.compute.manager [None req-cda1dfcf-8c90-4c50-aa5f-bc40dc9d6400 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] Terminating instance [ 1208.603480] env[62277]: DEBUG nova.compute.manager [None req-cda1dfcf-8c90-4c50-aa5f-bc40dc9d6400 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] Start destroying the instance on the hypervisor. {{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1208.603658] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-cda1dfcf-8c90-4c50-aa5f-bc40dc9d6400 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] Powering off the VM {{(pid=62277) power_off_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1502}} [ 1208.605344] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.PowerOffVM_Task with opID=oslo.vmware-b4882fa8-30ce-4924-a9c3-2ebf0f02b5fe {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1208.611619] env[62277]: DEBUG oslo_vmware.api [None req-cda1dfcf-8c90-4c50-aa5f-bc40dc9d6400 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Waiting for the task: (returnval){ [ 1208.611619] env[62277]: value = "task-1405366" [ 1208.611619] env[62277]: _type = "Task" [ 1208.611619] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1208.619742] env[62277]: DEBUG oslo_vmware.api [None req-cda1dfcf-8c90-4c50-aa5f-bc40dc9d6400 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Task: {'id': task-1405366, 'name': PowerOffVM_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1209.121390] env[62277]: DEBUG oslo_vmware.api [None req-cda1dfcf-8c90-4c50-aa5f-bc40dc9d6400 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Task: {'id': task-1405366, 'name': PowerOffVM_Task, 'duration_secs': 0.171195} completed successfully. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1209.121673] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-cda1dfcf-8c90-4c50-aa5f-bc40dc9d6400 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] Powered off the VM {{(pid=62277) power_off_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1507}} [ 1209.121862] env[62277]: DEBUG nova.virt.vmwareapi.volumeops [None req-cda1dfcf-8c90-4c50-aa5f-bc40dc9d6400 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] Volume detach. Driver type: vmdk {{(pid=62277) detach_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:646}} [ 1209.122832] env[62277]: DEBUG nova.virt.vmwareapi.volumeops [None req-cda1dfcf-8c90-4c50-aa5f-bc40dc9d6400 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] _detach_volume_vmdk: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-297813', 'volume_id': '8127ec42-b0f5-4086-b761-68b728ea559b', 'name': 'volume-8127ec42-b0f5-4086-b761-68b728ea559b', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '154fa64c-55d4-4b72-8af9-39e72fd5df5f', 'attached_at': '', 'detached_at': '', 'volume_id': '8127ec42-b0f5-4086-b761-68b728ea559b', 'serial': '8127ec42-b0f5-4086-b761-68b728ea559b'} {{(pid=62277) _detach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:571}} [ 1209.122832] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd403bb6-4a0b-46b6-8a7b-6a514808c675 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1209.140130] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-063a7acc-20fb-437d-beca-785d1cf5cd21 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1209.146375] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c8fd59fc-9a3b-44d4-9d57-5ce0631127fa {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1209.162957] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-27efd474-7e22-4cbd-af09-98c17b43a1a5 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1209.178523] env[62277]: DEBUG nova.virt.vmwareapi.volumeops [None req-cda1dfcf-8c90-4c50-aa5f-bc40dc9d6400 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] The volume has not been displaced from its original location: [datastore2] volume-8127ec42-b0f5-4086-b761-68b728ea559b/volume-8127ec42-b0f5-4086-b761-68b728ea559b.vmdk. No consolidation needed. 
{{(pid=62277) _consolidate_vmdk_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:504}} [ 1209.183710] env[62277]: DEBUG nova.virt.vmwareapi.volumeops [None req-cda1dfcf-8c90-4c50-aa5f-bc40dc9d6400 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] Reconfiguring VM instance instance-00000010 to detach disk 2000 {{(pid=62277) detach_disk_from_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:122}} [ 1209.183985] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-e2f70d96-0ce9-4cd9-8caf-80b52bfcba3b {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1209.202465] env[62277]: DEBUG oslo_vmware.api [None req-cda1dfcf-8c90-4c50-aa5f-bc40dc9d6400 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Waiting for the task: (returnval){ [ 1209.202465] env[62277]: value = "task-1405367" [ 1209.202465] env[62277]: _type = "Task" [ 1209.202465] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1209.210349] env[62277]: DEBUG oslo_vmware.api [None req-cda1dfcf-8c90-4c50-aa5f-bc40dc9d6400 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Task: {'id': task-1405367, 'name': ReconfigVM_Task} progress is 5%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1209.284795] env[62277]: DEBUG oslo_concurrency.lockutils [None req-0871020e-a31b-4cab-abc4-f4a95f73bc14 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Acquiring lock "350e2302-66b9-4dd6-b0f4-77000992408b" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1209.712539] env[62277]: DEBUG oslo_vmware.api [None req-cda1dfcf-8c90-4c50-aa5f-bc40dc9d6400 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Task: {'id': task-1405367, 'name': ReconfigVM_Task, 'duration_secs': 0.153393} completed successfully. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1209.712865] env[62277]: DEBUG nova.virt.vmwareapi.volumeops [None req-cda1dfcf-8c90-4c50-aa5f-bc40dc9d6400 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] Reconfigured VM instance instance-00000010 to detach disk 2000 {{(pid=62277) detach_disk_from_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:127}} [ 1209.717623] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-1dc9d38f-c1fb-4d90-9655-5989492fb6f9 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1209.734451] env[62277]: DEBUG oslo_vmware.api [None req-cda1dfcf-8c90-4c50-aa5f-bc40dc9d6400 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Waiting for the task: (returnval){ [ 1209.734451] env[62277]: value = "task-1405368" [ 1209.734451] env[62277]: _type = "Task" [ 1209.734451] env[62277]: } to complete. 
{{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1209.742756] env[62277]: DEBUG oslo_vmware.api [None req-cda1dfcf-8c90-4c50-aa5f-bc40dc9d6400 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Task: {'id': task-1405368, 'name': ReconfigVM_Task} progress is 5%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1210.168906] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1210.169038] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Cleaning up deleted instances {{(pid=62277) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11196}} [ 1210.184861] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] There are 0 instances to clean {{(pid=62277) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11205}} [ 1210.244249] env[62277]: DEBUG oslo_vmware.api [None req-cda1dfcf-8c90-4c50-aa5f-bc40dc9d6400 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Task: {'id': task-1405368, 'name': ReconfigVM_Task} progress is 14%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1210.745415] env[62277]: DEBUG oslo_vmware.api [None req-cda1dfcf-8c90-4c50-aa5f-bc40dc9d6400 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Task: {'id': task-1405368, 'name': ReconfigVM_Task, 'duration_secs': 0.68121} completed successfully. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1210.745751] env[62277]: DEBUG nova.virt.vmwareapi.volumeops [None req-cda1dfcf-8c90-4c50-aa5f-bc40dc9d6400 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] Detached VMDK: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-297813', 'volume_id': '8127ec42-b0f5-4086-b761-68b728ea559b', 'name': 'volume-8127ec42-b0f5-4086-b761-68b728ea559b', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '154fa64c-55d4-4b72-8af9-39e72fd5df5f', 'attached_at': '', 'detached_at': '', 'volume_id': '8127ec42-b0f5-4086-b761-68b728ea559b', 'serial': '8127ec42-b0f5-4086-b761-68b728ea559b'} {{(pid=62277) _detach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:605}} [ 1210.745998] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-cda1dfcf-8c90-4c50-aa5f-bc40dc9d6400 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1210.746746] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5fdb1f4f-33bf-41a8-a7d6-e79c6064de38 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1210.753222] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-cda1dfcf-8c90-4c50-aa5f-bc40dc9d6400 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] Unregistering the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1210.753471] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-bc809a10-73c0-4cc1-b4d9-00b8f54e28a4 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1210.839415] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-cda1dfcf-8c90-4c50-aa5f-bc40dc9d6400 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] Unregistered the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1210.839631] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-cda1dfcf-8c90-4c50-aa5f-bc40dc9d6400 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] Deleting contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1210.839807] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-cda1dfcf-8c90-4c50-aa5f-bc40dc9d6400 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Deleting the datastore file [datastore2] 154fa64c-55d4-4b72-8af9-39e72fd5df5f {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1210.840105] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-783b75ff-e110-4ad2-93a9-937200be3b11 {{(pid=62277) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1210.846974] env[62277]: DEBUG oslo_vmware.api [None req-cda1dfcf-8c90-4c50-aa5f-bc40dc9d6400 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Waiting for the task: (returnval){ [ 1210.846974] env[62277]: value = "task-1405370" [ 1210.846974] env[62277]: _type = "Task" [ 1210.846974] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1210.855440] env[62277]: DEBUG oslo_vmware.api [None req-cda1dfcf-8c90-4c50-aa5f-bc40dc9d6400 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Task: {'id': task-1405370, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1211.357446] env[62277]: DEBUG oslo_vmware.api [None req-cda1dfcf-8c90-4c50-aa5f-bc40dc9d6400 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Task: {'id': task-1405370, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.077891} completed successfully. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1211.357702] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-cda1dfcf-8c90-4c50-aa5f-bc40dc9d6400 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Deleted the datastore file {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1211.357882] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-cda1dfcf-8c90-4c50-aa5f-bc40dc9d6400 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] Deleted contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1211.358068] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-cda1dfcf-8c90-4c50-aa5f-bc40dc9d6400 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1211.358243] env[62277]: INFO nova.compute.manager [None req-cda1dfcf-8c90-4c50-aa5f-bc40dc9d6400 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] Took 2.75 seconds to destroy the instance on the hypervisor. [ 1211.358489] env[62277]: DEBUG oslo.service.loopingcall [None req-cda1dfcf-8c90-4c50-aa5f-bc40dc9d6400 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1211.358689] env[62277]: DEBUG nova.compute.manager [-] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1211.358791] env[62277]: DEBUG nova.network.neutron [-] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1212.059051] env[62277]: DEBUG nova.network.neutron [-] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1212.072135] env[62277]: INFO nova.compute.manager [-] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] Took 0.71 seconds to deallocate network for instance. [ 1212.125670] env[62277]: DEBUG nova.compute.manager [req-4bdc1a7a-d66d-4d7d-9291-266f998ab89c req-bbd60cd2-1391-47c9-b27e-35dd267064e0 service nova] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] Received event network-vif-deleted-09dcbb3a-90a6-4559-be66-b51fd18ff545 {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 1212.144848] env[62277]: INFO nova.compute.manager [None req-cda1dfcf-8c90-4c50-aa5f-bc40dc9d6400 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] Took 0.07 seconds to detach 1 volumes for instance. [ 1212.146620] env[62277]: DEBUG nova.compute.manager [None req-cda1dfcf-8c90-4c50-aa5f-bc40dc9d6400 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] Deleting volume: 8127ec42-b0f5-4086-b761-68b728ea559b {{(pid=62277) _cleanup_volumes /opt/stack/nova/nova/compute/manager.py:3221}} [ 1212.273326] env[62277]: DEBUG oslo_concurrency.lockutils [None req-cda1dfcf-8c90-4c50-aa5f-bc40dc9d6400 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1212.273682] env[62277]: DEBUG oslo_concurrency.lockutils [None req-cda1dfcf-8c90-4c50-aa5f-bc40dc9d6400 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1212.273957] env[62277]: DEBUG nova.objects.instance [None req-cda1dfcf-8c90-4c50-aa5f-bc40dc9d6400 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Lazy-loading 'resources' on Instance uuid 154fa64c-55d4-4b72-8af9-39e72fd5df5f {{(pid=62277) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} [ 1212.465123] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b01d96ad-e621-46ce-b976-851d54e12df8 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Acquiring lock "23bc5a48-3e96-4897-bf28-ad14a0bdde62" by 
"nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1212.749552] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5d5811e4-8351-4682-91c5-7536b92be93e {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1212.756906] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-34c20793-f2e9-4268-8a5d-47f0af5c2874 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1212.792135] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6af22cf2-8e85-445e-a071-99d48657684d {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1212.802059] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1d081965-c20e-49e6-a779-75530c0607a4 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1212.821510] env[62277]: DEBUG nova.compute.provider_tree [None req-cda1dfcf-8c90-4c50-aa5f-bc40dc9d6400 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1212.831692] env[62277]: DEBUG nova.scheduler.client.report [None req-cda1dfcf-8c90-4c50-aa5f-bc40dc9d6400 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1212.855745] env[62277]: DEBUG oslo_concurrency.lockutils [None req-cda1dfcf-8c90-4c50-aa5f-bc40dc9d6400 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.582s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1212.878566] env[62277]: INFO nova.scheduler.client.report [None req-cda1dfcf-8c90-4c50-aa5f-bc40dc9d6400 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Deleted allocations for instance 154fa64c-55d4-4b72-8af9-39e72fd5df5f [ 1212.958296] env[62277]: DEBUG oslo_concurrency.lockutils [None req-cda1dfcf-8c90-4c50-aa5f-bc40dc9d6400 tempest-ServersTestBootFromVolume-1085470207 tempest-ServersTestBootFromVolume-1085470207-project-member] Lock "154fa64c-55d4-4b72-8af9-39e72fd5df5f" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 4.359s {{(pid=62277) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1213.678457] env[62277]: DEBUG oslo_concurrency.lockutils [None req-22722f1a-8fe1-4435-bba7-68377be5b793 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Acquiring lock "346748bd-b4e8-4e93-b71d-66c90a45e372" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1215.184615] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1215.184903] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Starting heal instance info cache {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9909}} [ 1215.185025] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Rebuilding the list of instances to heal {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9913}} [ 1215.208045] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1215.208045] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1215.208045] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1215.208254] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 68925f1b-da69-4955-acb1-d6500b03daee] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1215.208254] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1215.208400] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1215.208518] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Skipping network cache update for instance because it is Building. 
{{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1215.208644] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1215.208782] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1215.208905] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1215.209036] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Didn't find any instances for network info cache update. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9995}} [ 1215.209834] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1217.171115] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1217.171115] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1217.171115] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Cleaning up deleted instances with incomplete migration {{(pid=62277) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11234}} [ 1218.178798] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1219.165121] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1219.193640] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1219.194225] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager.update_available_resource {{(pid=62277) 
run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1219.210017] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1219.210017] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1219.210017] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1219.210017] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62277) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1219.210017] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-58f89865-a79a-4c9e-9fcc-fcee6f4fff38 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1219.220012] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4550736f-f3e8-45c1-8e28-a88b9b4fcd93 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1219.242020] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-29fd2903-e07f-417f-86d3-563e0695ffc4 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1219.247306] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e9f66aef-1ea7-4b54-96ac-acfb8c7689a2 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1219.278255] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181390MB free_disk=184GB free_vcpus=48 pci_devices=None {{(pid=62277) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1219.278453] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1219.278676] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62277) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1219.469921] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 19d15611-315f-4c4b-8f32-e5d00d0d8ca8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1219.470117] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance dfc291fd-1481-4e76-9fb3-ec87124c1281 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1219.470248] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 36ff1435-1999-4e95-8920-81a1b25cc452 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1219.470371] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 68925f1b-da69-4955-acb1-d6500b03daee actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1219.470522] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 866c4415-caab-4d81-86ba-ed662feb3c4f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1219.470608] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 350e2302-66b9-4dd6-b0f4-77000992408b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1219.470726] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 23bc5a48-3e96-4897-bf28-ad14a0bdde62 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1219.470843] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 346748bd-b4e8-4e93-b71d-66c90a45e372 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1219.470958] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 930ff058-ab48-4c8a-8f5e-4820a3b12d50 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1219.471097] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1219.486486] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 79177b7a-1bf6-4649-80c7-4ba2c6cda0ad has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1219.502195] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 4cc3443d-3d05-4d3b-a222-6b7367c1c989 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1219.514805] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 04122dba-4cbf-4176-95bd-f1c4bfaa799e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1219.528792] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 30d7d279-7241-4f2e-b963-d205a5f9fa41 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1219.540784] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance fb905507-549f-4265-9851-4a42930c02a4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1219.552820] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 4f5018a5-26f5-4a31-8a9d-d2557c906995 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1219.564490] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance c01ad807-a9a5-4028-baf4-0469a6301459 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1219.575575] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 1e8429b2-7149-4832-8590-e0ebd8501176 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1219.587426] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance f646a534-1ae8-40dd-9819-3d71bda87ae2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1219.598633] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 2b98866e-3c86-47bd-9eff-2c2743631563 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1219.610359] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 560b7750-03fe-4a4c-ab1d-a1751895986b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1219.623846] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 86e8e8ba-e476-400d-b180-bb7df8a042d8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1219.634930] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance c3e72352-f795-4ce7-9e0b-4e80c4329f7b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1219.645718] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance b6908c32-5916-4a0e-92e2-21f480c5f7ca has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1219.658018] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 27cd13ca-a17c-476e-a00a-cca1fe898763 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1219.670777] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance c000e183-2e57-470e-a9a5-30b5899e77c1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1219.684786] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 21bd4623-2b46-43c4-859f-c4d3bf261e1f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1219.705039] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 6d759045-e1fc-43ea-a882-1ead769b6d29 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1219.705310] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1219.705459] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1219.723639] env[62277]: DEBUG nova.scheduler.client.report [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Refreshing inventories for resource provider 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 1219.749013] env[62277]: DEBUG nova.scheduler.client.report [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Updating ProviderTree inventory for provider 75e125ea-a599-4b65-b9cd-6ea881735292 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 1219.749230] env[62277]: DEBUG nova.compute.provider_tree [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Updating inventory in ProviderTree for provider 75e125ea-a599-4b65-b9cd-6ea881735292 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 1219.764852] env[62277]: DEBUG nova.scheduler.client.report [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Refreshing aggregate associations for resource provider 75e125ea-a599-4b65-b9cd-6ea881735292, aggregates: None {{(pid=62277) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 1219.784861] env[62277]: DEBUG nova.scheduler.client.report [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Refreshing trait associations for resource provider 75e125ea-a599-4b65-b9cd-6ea881735292, traits: COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_VMDK {{(pid=62277) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 1220.210708] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f006a093-d536-4aa6-b396-5bd499666587 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1220.219029] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-d0d46b56-e7d0-4587-a43d-1539802a2cc5 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1220.252721] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-56656697-40bc-4258-8412-13cde2f61c83 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1220.261288] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a06e7080-af7e-4047-a474-66a8efcff1d4 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1220.277450] env[62277]: DEBUG nova.compute.provider_tree [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1220.286898] env[62277]: DEBUG nova.scheduler.client.report [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1220.303698] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62277) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1220.303698] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.024s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1221.303560] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1221.303560] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1222.263322] env[62277]: DEBUG oslo_concurrency.lockutils [None req-3b49b6dd-08af-401b-8be0-58a31ad8d1f7 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Acquiring lock "930ff058-ab48-4c8a-8f5e-4820a3b12d50" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1222.308214] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 
tempest-AttachVolumeTestJSON-899048172-project-member] Acquiring lock "32bed248-06d5-47a1-b281-47921d99dbf6" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1222.308214] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Lock "32bed248-06d5-47a1-b281-47921d99dbf6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1222.678938] env[62277]: DEBUG oslo_concurrency.lockutils [None req-f8f6abaf-c69a-41fb-8aab-0ac498ab193e tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Acquiring lock "3b9b51fe-d558-40ef-b22a-2b8958b3e1a0" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1224.169428] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1225.176535] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1225.176810] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=62277) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10528}} [ 1227.891441] env[62277]: DEBUG oslo_concurrency.lockutils [None req-5ffc1411-5f8e-44bd-8435-e2c993f18ac8 tempest-MultipleCreateTestJSON-1044264016 tempest-MultipleCreateTestJSON-1044264016-project-member] Acquiring lock "e573e784-3318-4a41-89fd-40cbe8749413" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1227.891712] env[62277]: DEBUG oslo_concurrency.lockutils [None req-5ffc1411-5f8e-44bd-8435-e2c993f18ac8 tempest-MultipleCreateTestJSON-1044264016 tempest-MultipleCreateTestJSON-1044264016-project-member] Lock "e573e784-3318-4a41-89fd-40cbe8749413" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1227.920265] env[62277]: DEBUG oslo_concurrency.lockutils [None req-5ffc1411-5f8e-44bd-8435-e2c993f18ac8 tempest-MultipleCreateTestJSON-1044264016 tempest-MultipleCreateTestJSON-1044264016-project-member] Acquiring lock "06571cd1-61a1-48e9-a204-624d5f383ad3" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1227.920265] env[62277]: DEBUG oslo_concurrency.lockutils [None req-5ffc1411-5f8e-44bd-8435-e2c993f18ac8 tempest-MultipleCreateTestJSON-1044264016 tempest-MultipleCreateTestJSON-1044264016-project-member] Lock "06571cd1-61a1-48e9-a204-624d5f383ad3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1228.335690] env[62277]: WARNING oslo_vmware.rw_handles [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1228.335690] env[62277]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1228.335690] env[62277]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1228.335690] env[62277]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1228.335690] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1228.335690] env[62277]: ERROR oslo_vmware.rw_handles response.begin() [ 1228.335690] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1228.335690] env[62277]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1228.335690] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1228.335690] env[62277]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1228.335690] env[62277]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1228.335690] env[62277]: ERROR oslo_vmware.rw_handles [ 1228.336392] env[62277]: DEBUG 
nova.virt.vmwareapi.images [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] Downloaded image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to vmware_temp/6a1cbc53-a14e-495a-bca1-8725587c770c/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1228.338614] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] Caching image {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1228.339049] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Copying Virtual Disk [datastore2] vmware_temp/6a1cbc53-a14e-495a-bca1-8725587c770c/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk to [datastore2] vmware_temp/6a1cbc53-a14e-495a-bca1-8725587c770c/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk {{(pid=62277) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1228.339507] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-a3d1cebb-9e76-4907-b73c-b6b734f70f64 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1228.348296] env[62277]: DEBUG oslo_vmware.api [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Waiting for the task: (returnval){ [ 1228.348296] env[62277]: value = "task-1405372" [ 1228.348296] env[62277]: _type = "Task" [ 1228.348296] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1228.359215] env[62277]: DEBUG oslo_vmware.api [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Task: {'id': task-1405372, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1228.859385] env[62277]: DEBUG oslo_vmware.exceptions [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Fault InvalidArgument not matched. 
{{(pid=62277) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1228.859776] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1228.860440] env[62277]: ERROR nova.compute.manager [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1228.860440] env[62277]: Faults: ['InvalidArgument'] [ 1228.860440] env[62277]: ERROR nova.compute.manager [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] Traceback (most recent call last): [ 1228.860440] env[62277]: ERROR nova.compute.manager [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1228.860440] env[62277]: ERROR nova.compute.manager [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] yield resources [ 1228.860440] env[62277]: ERROR nova.compute.manager [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1228.860440] env[62277]: ERROR nova.compute.manager [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] self.driver.spawn(context, instance, image_meta, [ 1228.860440] env[62277]: ERROR nova.compute.manager [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1228.860440] env[62277]: ERROR nova.compute.manager [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1228.860440] env[62277]: ERROR nova.compute.manager [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1228.860440] env[62277]: ERROR nova.compute.manager [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] self._fetch_image_if_missing(context, vi) [ 1228.860440] env[62277]: ERROR nova.compute.manager [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1228.860440] env[62277]: ERROR nova.compute.manager [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] image_cache(vi, tmp_image_ds_loc) [ 1228.860440] env[62277]: ERROR nova.compute.manager [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1228.860440] env[62277]: ERROR nova.compute.manager [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] vm_util.copy_virtual_disk( [ 1228.860440] env[62277]: ERROR nova.compute.manager [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1228.860440] env[62277]: ERROR nova.compute.manager [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] session._wait_for_task(vmdk_copy_task) [ 1228.860440] env[62277]: ERROR nova.compute.manager [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1228.860440] env[62277]: ERROR nova.compute.manager [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] return self.wait_for_task(task_ref) [ 1228.860440] env[62277]: ERROR nova.compute.manager [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1228.860440] env[62277]: ERROR nova.compute.manager [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] return evt.wait() [ 1228.860440] env[62277]: ERROR nova.compute.manager [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1228.860440] env[62277]: ERROR nova.compute.manager [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] result = hub.switch() [ 1228.860440] env[62277]: ERROR nova.compute.manager [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1228.860440] env[62277]: ERROR nova.compute.manager [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] return self.greenlet.switch() [ 1228.860440] env[62277]: ERROR nova.compute.manager [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1228.860440] env[62277]: ERROR nova.compute.manager [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] self.f(*self.args, **self.kw) [ 1228.860440] env[62277]: ERROR nova.compute.manager [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1228.860440] env[62277]: ERROR nova.compute.manager [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] raise exceptions.translate_fault(task_info.error) [ 1228.860440] env[62277]: ERROR nova.compute.manager [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1228.860440] env[62277]: ERROR nova.compute.manager [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] Faults: ['InvalidArgument'] [ 1228.860440] env[62277]: ERROR nova.compute.manager [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] [ 1228.861336] env[62277]: INFO nova.compute.manager [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] Terminating instance [ 1228.863040] env[62277]: DEBUG oslo_concurrency.lockutils [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1228.863298] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1228.864044] env[62277]: DEBUG nova.compute.manager [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 
tempest-ServersTestFqdnHostnames-1297902273-project-member] [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] Start destroying the instance on the hypervisor. {{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1228.867206] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1228.867206] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3af6eab5-c492-45dd-afd3-ea53d68d45b0 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1228.868168] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3d2b1872-7b7a-4cf7-941d-85a34f1e2ad7 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1228.875844] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] Unregistering the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1228.878238] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-65b81260-95f4-4a26-9b46-3bba41c2ce5b {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1228.878656] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1228.878878] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62277) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1228.879702] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e964bb2d-99ec-4805-b920-67eda9ed0e7e {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1228.885367] env[62277]: DEBUG oslo_vmware.api [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Waiting for the task: (returnval){ [ 1228.885367] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]524bf361-1103-4489-aca7-71c76efac69d" [ 1228.885367] env[62277]: _type = "Task" [ 1228.885367] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1228.893178] env[62277]: DEBUG oslo_vmware.api [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]524bf361-1103-4489-aca7-71c76efac69d, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1228.995922] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] Unregistered the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1228.996205] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] Deleting contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1228.996420] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Deleting the datastore file [datastore2] 19d15611-315f-4c4b-8f32-e5d00d0d8ca8 {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1228.996721] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-40cc5dc4-fdd5-4cde-a0d8-2cd131e39dd9 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1229.003114] env[62277]: DEBUG oslo_vmware.api [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Waiting for the task: (returnval){ [ 1229.003114] env[62277]: value = "task-1405374" [ 1229.003114] env[62277]: _type = "Task" [ 1229.003114] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1229.010702] env[62277]: DEBUG oslo_vmware.api [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Task: {'id': task-1405374, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1229.395806] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] Preparing fetch location {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1229.396097] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Creating directory with path [datastore2] vmware_temp/0271724a-606f-404a-b072-f17e15265de5/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1229.396339] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-bb2be3c9-c881-4e74-b23e-ecdbe65a0b07 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1229.407830] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Created directory with path [datastore2] vmware_temp/0271724a-606f-404a-b072-f17e15265de5/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1229.408101] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] Fetch image to [datastore2] vmware_temp/0271724a-606f-404a-b072-f17e15265de5/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1229.408212] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to [datastore2] vmware_temp/0271724a-606f-404a-b072-f17e15265de5/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1229.408950] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0564f357-1707-460e-904b-f7ae3517f86a {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1229.415781] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-41c19760-c3df-4f12-9d9d-e79693de9509 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1229.425596] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b6021977-f5ac-40e8-8159-bf34da3da89d {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1229.461490] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-445fc537-d242-4f16-9641-a4d4011d2336 {{(pid=62277) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1229.468050] env[62277]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-bfa717ef-e2ac-4bb9-a249-a1904f742e64 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1229.489349] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1229.519447] env[62277]: DEBUG oslo_vmware.api [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Task: {'id': task-1405374, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.071433} completed successfully. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1229.522765] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Deleted the datastore file {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1229.523019] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] Deleted contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1229.523223] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1229.525312] env[62277]: INFO nova.compute.manager [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] Took 0.66 seconds to destroy the instance on the hypervisor. 
[ 1229.525614] env[62277]: DEBUG nova.compute.claims [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] Aborting claim: {{(pid=62277) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1229.525701] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1229.525934] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1229.549295] env[62277]: DEBUG oslo_vmware.rw_handles [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0271724a-606f-404a-b072-f17e15265de5/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62277) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1229.614454] env[62277]: DEBUG oslo_vmware.rw_handles [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Completed reading data from the image iterator. {{(pid=62277) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1229.614702] env[62277]: DEBUG oslo_vmware.rw_handles [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0271724a-606f-404a-b072-f17e15265de5/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=62277) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1229.984649] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0db2acf0-4857-45b5-b696-15e8d49840e5 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1229.993752] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-df6b9ceb-d708-42ba-a9a2-53d520578ea8 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1230.023401] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c14b0ac1-8fab-4447-984b-15dd08b9e636 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1230.030767] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-77695c0c-f7d3-42f5-8b5d-4bb6fc929cb0 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1230.044967] env[62277]: DEBUG nova.compute.provider_tree [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1230.059712] env[62277]: DEBUG nova.scheduler.client.report [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1230.076026] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.549s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1230.076126] env[62277]: ERROR nova.compute.manager [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1230.076126] env[62277]: Faults: ['InvalidArgument'] [ 1230.076126] env[62277]: ERROR nova.compute.manager [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] Traceback (most recent call last): [ 1230.076126] env[62277]: ERROR nova.compute.manager [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1230.076126] 
env[62277]: ERROR nova.compute.manager [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] self.driver.spawn(context, instance, image_meta, [ 1230.076126] env[62277]: ERROR nova.compute.manager [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1230.076126] env[62277]: ERROR nova.compute.manager [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1230.076126] env[62277]: ERROR nova.compute.manager [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1230.076126] env[62277]: ERROR nova.compute.manager [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] self._fetch_image_if_missing(context, vi) [ 1230.076126] env[62277]: ERROR nova.compute.manager [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1230.076126] env[62277]: ERROR nova.compute.manager [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] image_cache(vi, tmp_image_ds_loc) [ 1230.076126] env[62277]: ERROR nova.compute.manager [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1230.076126] env[62277]: ERROR nova.compute.manager [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] vm_util.copy_virtual_disk( [ 1230.076126] env[62277]: ERROR nova.compute.manager [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1230.076126] env[62277]: ERROR nova.compute.manager [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] session._wait_for_task(vmdk_copy_task) [ 1230.076126] env[62277]: ERROR nova.compute.manager [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1230.076126] env[62277]: ERROR nova.compute.manager [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] return self.wait_for_task(task_ref) [ 1230.076126] env[62277]: ERROR nova.compute.manager [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1230.076126] env[62277]: ERROR nova.compute.manager [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] return evt.wait() [ 1230.076126] env[62277]: ERROR nova.compute.manager [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1230.076126] env[62277]: ERROR nova.compute.manager [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] result = hub.switch() [ 1230.076126] env[62277]: ERROR nova.compute.manager [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1230.076126] env[62277]: ERROR nova.compute.manager [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] return self.greenlet.switch() [ 1230.076126] env[62277]: ERROR nova.compute.manager [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1230.076126] env[62277]: ERROR nova.compute.manager [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] self.f(*self.args, **self.kw) [ 1230.076126] env[62277]: ERROR nova.compute.manager [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1230.076126] env[62277]: ERROR nova.compute.manager [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] raise exceptions.translate_fault(task_info.error) [ 1230.076126] env[62277]: ERROR nova.compute.manager [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1230.076126] env[62277]: ERROR nova.compute.manager [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] Faults: ['InvalidArgument'] [ 1230.076126] env[62277]: ERROR nova.compute.manager [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] [ 1230.076993] env[62277]: DEBUG nova.compute.utils [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] VimFaultException {{(pid=62277) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1230.078347] env[62277]: DEBUG nova.compute.manager [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] Build of instance 19d15611-315f-4c4b-8f32-e5d00d0d8ca8 was re-scheduled: A specified parameter was not correct: fileType [ 1230.078347] env[62277]: Faults: ['InvalidArgument'] {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1230.078693] env[62277]: DEBUG nova.compute.manager [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] Unplugging VIFs for instance {{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1230.078864] env[62277]: DEBUG nova.compute.manager [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1230.079034] env[62277]: DEBUG nova.compute.manager [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1230.079192] env[62277]: DEBUG nova.network.neutron [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1230.567164] env[62277]: DEBUG nova.network.neutron [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1230.589921] env[62277]: INFO nova.compute.manager [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] Took 0.51 seconds to deallocate network for instance. [ 1230.710821] env[62277]: INFO nova.scheduler.client.report [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Deleted allocations for instance 19d15611-315f-4c4b-8f32-e5d00d0d8ca8 [ 1230.734199] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7fde529c-2746-430b-bfbe-e20e113496e7 tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Lock "19d15611-315f-4c4b-8f32-e5d00d0d8ca8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 239.351s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1230.735384] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b6b53b76-79f6-467b-a6d4-2655043bf5de tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Lock "19d15611-315f-4c4b-8f32-e5d00d0d8ca8" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 38.269s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1230.735600] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b6b53b76-79f6-467b-a6d4-2655043bf5de tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Acquiring lock "19d15611-315f-4c4b-8f32-e5d00d0d8ca8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1230.735862] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b6b53b76-79f6-467b-a6d4-2655043bf5de tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Lock "19d15611-315f-4c4b-8f32-e5d00d0d8ca8-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1230.735950] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b6b53b76-79f6-467b-a6d4-2655043bf5de tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Lock "19d15611-315f-4c4b-8f32-e5d00d0d8ca8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1230.740288] env[62277]: INFO nova.compute.manager [None req-b6b53b76-79f6-467b-a6d4-2655043bf5de tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] Terminating instance [ 1230.742244] env[62277]: DEBUG nova.compute.manager [None req-b6b53b76-79f6-467b-a6d4-2655043bf5de tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] Start destroying the instance on the hypervisor. {{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1230.742565] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-b6b53b76-79f6-467b-a6d4-2655043bf5de tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1230.743240] env[62277]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-c4292719-533e-4463-be45-81c60f712ed6 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1230.752530] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0debe43-a514-4d2f-b8f0-6f6d38f99c15 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1230.764606] env[62277]: DEBUG nova.compute.manager [None req-ffd72be3-f106-43c5-b3e1-558211d8f7ca tempest-MultipleCreateTestJSON-1044264016 tempest-MultipleCreateTestJSON-1044264016-project-member] [instance: 79177b7a-1bf6-4649-80c7-4ba2c6cda0ad] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1230.785787] env[62277]: WARNING nova.virt.vmwareapi.vmops [None req-b6b53b76-79f6-467b-a6d4-2655043bf5de tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 19d15611-315f-4c4b-8f32-e5d00d0d8ca8 could not be found. 
[ 1230.786098] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-b6b53b76-79f6-467b-a6d4-2655043bf5de tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1230.786342] env[62277]: INFO nova.compute.manager [None req-b6b53b76-79f6-467b-a6d4-2655043bf5de tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1230.786662] env[62277]: DEBUG oslo.service.loopingcall [None req-b6b53b76-79f6-467b-a6d4-2655043bf5de tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1230.787241] env[62277]: DEBUG nova.compute.manager [-] [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1230.787405] env[62277]: DEBUG nova.network.neutron [-] [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1230.792670] env[62277]: DEBUG nova.compute.manager [None req-ffd72be3-f106-43c5-b3e1-558211d8f7ca tempest-MultipleCreateTestJSON-1044264016 tempest-MultipleCreateTestJSON-1044264016-project-member] [instance: 79177b7a-1bf6-4649-80c7-4ba2c6cda0ad] Instance disappeared before build. {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1230.815216] env[62277]: DEBUG nova.network.neutron [-] [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1230.833109] env[62277]: INFO nova.compute.manager [-] [instance: 19d15611-315f-4c4b-8f32-e5d00d0d8ca8] Took 0.04 seconds to deallocate network for instance. [ 1230.841755] env[62277]: DEBUG oslo_concurrency.lockutils [None req-ffd72be3-f106-43c5-b3e1-558211d8f7ca tempest-MultipleCreateTestJSON-1044264016 tempest-MultipleCreateTestJSON-1044264016-project-member] Lock "79177b7a-1bf6-4649-80c7-4ba2c6cda0ad" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 205.405s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1230.855176] env[62277]: DEBUG nova.compute.manager [None req-ffd72be3-f106-43c5-b3e1-558211d8f7ca tempest-MultipleCreateTestJSON-1044264016 tempest-MultipleCreateTestJSON-1044264016-project-member] [instance: 4cc3443d-3d05-4d3b-a222-6b7367c1c989] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1230.889474] env[62277]: DEBUG nova.compute.manager [None req-ffd72be3-f106-43c5-b3e1-558211d8f7ca tempest-MultipleCreateTestJSON-1044264016 tempest-MultipleCreateTestJSON-1044264016-project-member] [instance: 4cc3443d-3d05-4d3b-a222-6b7367c1c989] Instance disappeared before build. 
{{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1230.921838] env[62277]: DEBUG oslo_concurrency.lockutils [None req-ffd72be3-f106-43c5-b3e1-558211d8f7ca tempest-MultipleCreateTestJSON-1044264016 tempest-MultipleCreateTestJSON-1044264016-project-member] Lock "4cc3443d-3d05-4d3b-a222-6b7367c1c989" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 205.451s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1230.935682] env[62277]: DEBUG nova.compute.manager [None req-34538215-56b8-42d9-8162-924ac491e844 tempest-ServerMetadataTestJSON-1737770471 tempest-ServerMetadataTestJSON-1737770471-project-member] [instance: 04122dba-4cbf-4176-95bd-f1c4bfaa799e] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1230.972611] env[62277]: DEBUG nova.compute.manager [None req-34538215-56b8-42d9-8162-924ac491e844 tempest-ServerMetadataTestJSON-1737770471 tempest-ServerMetadataTestJSON-1737770471-project-member] [instance: 04122dba-4cbf-4176-95bd-f1c4bfaa799e] Instance disappeared before build. {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1230.983146] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b6b53b76-79f6-467b-a6d4-2655043bf5de tempest-ServersTestFqdnHostnames-1297902273 tempest-ServersTestFqdnHostnames-1297902273-project-member] Lock "19d15611-315f-4c4b-8f32-e5d00d0d8ca8" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.248s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1231.003817] env[62277]: DEBUG oslo_concurrency.lockutils [None req-34538215-56b8-42d9-8162-924ac491e844 tempest-ServerMetadataTestJSON-1737770471 tempest-ServerMetadataTestJSON-1737770471-project-member] Lock "04122dba-4cbf-4176-95bd-f1c4bfaa799e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 204.860s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1231.014561] env[62277]: DEBUG nova.compute.manager [None req-f6182b83-f1e1-4d2b-91e7-5f82871cf9d7 tempest-FloatingIPsAssociationNegativeTestJSON-1284752111 tempest-FloatingIPsAssociationNegativeTestJSON-1284752111-project-member] [instance: 30d7d279-7241-4f2e-b963-d205a5f9fa41] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1231.038175] env[62277]: DEBUG nova.compute.manager [None req-f6182b83-f1e1-4d2b-91e7-5f82871cf9d7 tempest-FloatingIPsAssociationNegativeTestJSON-1284752111 tempest-FloatingIPsAssociationNegativeTestJSON-1284752111-project-member] [instance: 30d7d279-7241-4f2e-b963-d205a5f9fa41] Instance disappeared before build. 
{{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1231.063806] env[62277]: DEBUG oslo_concurrency.lockutils [None req-f6182b83-f1e1-4d2b-91e7-5f82871cf9d7 tempest-FloatingIPsAssociationNegativeTestJSON-1284752111 tempest-FloatingIPsAssociationNegativeTestJSON-1284752111-project-member] Lock "30d7d279-7241-4f2e-b963-d205a5f9fa41" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 200.790s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1231.079608] env[62277]: DEBUG nova.compute.manager [None req-cda784f4-a56f-494b-9766-89b311c4a626 tempest-ServerActionsTestOtherA-1333146290 tempest-ServerActionsTestOtherA-1333146290-project-member] [instance: fb905507-549f-4265-9851-4a42930c02a4] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1231.104483] env[62277]: DEBUG nova.compute.manager [None req-cda784f4-a56f-494b-9766-89b311c4a626 tempest-ServerActionsTestOtherA-1333146290 tempest-ServerActionsTestOtherA-1333146290-project-member] [instance: fb905507-549f-4265-9851-4a42930c02a4] Instance disappeared before build. {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1231.126391] env[62277]: DEBUG oslo_concurrency.lockutils [None req-cda784f4-a56f-494b-9766-89b311c4a626 tempest-ServerActionsTestOtherA-1333146290 tempest-ServerActionsTestOtherA-1333146290-project-member] Lock "fb905507-549f-4265-9851-4a42930c02a4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 198.943s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1231.137724] env[62277]: DEBUG nova.compute.manager [None req-17030804-65dc-46c0-96b3-152f53ed921a tempest-FloatingIPsAssociationTestJSON-809897368 tempest-FloatingIPsAssociationTestJSON-809897368-project-member] [instance: 4f5018a5-26f5-4a31-8a9d-d2557c906995] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1231.161666] env[62277]: DEBUG nova.compute.manager [None req-17030804-65dc-46c0-96b3-152f53ed921a tempest-FloatingIPsAssociationTestJSON-809897368 tempest-FloatingIPsAssociationTestJSON-809897368-project-member] [instance: 4f5018a5-26f5-4a31-8a9d-d2557c906995] Instance disappeared before build. {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1231.194767] env[62277]: DEBUG oslo_concurrency.lockutils [None req-17030804-65dc-46c0-96b3-152f53ed921a tempest-FloatingIPsAssociationTestJSON-809897368 tempest-FloatingIPsAssociationTestJSON-809897368-project-member] Lock "4f5018a5-26f5-4a31-8a9d-d2557c906995" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 197.985s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1231.208220] env[62277]: DEBUG nova.compute.manager [None req-80a2a202-71d0-421b-93e2-fb77247d2728 tempest-SecurityGroupsTestJSON-463872970 tempest-SecurityGroupsTestJSON-463872970-project-member] [instance: c01ad807-a9a5-4028-baf4-0469a6301459] Starting instance... 
{{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1231.265781] env[62277]: DEBUG oslo_concurrency.lockutils [None req-80a2a202-71d0-421b-93e2-fb77247d2728 tempest-SecurityGroupsTestJSON-463872970 tempest-SecurityGroupsTestJSON-463872970-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1231.266074] env[62277]: DEBUG oslo_concurrency.lockutils [None req-80a2a202-71d0-421b-93e2-fb77247d2728 tempest-SecurityGroupsTestJSON-463872970 tempest-SecurityGroupsTestJSON-463872970-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1231.267717] env[62277]: INFO nova.compute.claims [None req-80a2a202-71d0-421b-93e2-fb77247d2728 tempest-SecurityGroupsTestJSON-463872970 tempest-SecurityGroupsTestJSON-463872970-project-member] [instance: c01ad807-a9a5-4028-baf4-0469a6301459] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1231.704323] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cfb69ff3-0f5e-468e-b1ba-dea36771f237 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1231.712322] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-30217141-d7cd-4c5b-8212-6053000603e3 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1231.742949] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-790a0608-4680-4ad7-98dc-fb330acda150 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1231.751975] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-84253b20-2935-4b9a-90b5-0364a5bd8bfd {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1231.766882] env[62277]: DEBUG nova.compute.provider_tree [None req-80a2a202-71d0-421b-93e2-fb77247d2728 tempest-SecurityGroupsTestJSON-463872970 tempest-SecurityGroupsTestJSON-463872970-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1231.780081] env[62277]: DEBUG nova.scheduler.client.report [None req-80a2a202-71d0-421b-93e2-fb77247d2728 tempest-SecurityGroupsTestJSON-463872970 tempest-SecurityGroupsTestJSON-463872970-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1231.799386] env[62277]: DEBUG oslo_concurrency.lockutils [None 
req-80a2a202-71d0-421b-93e2-fb77247d2728 tempest-SecurityGroupsTestJSON-463872970 tempest-SecurityGroupsTestJSON-463872970-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.533s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1231.800016] env[62277]: DEBUG nova.compute.manager [None req-80a2a202-71d0-421b-93e2-fb77247d2728 tempest-SecurityGroupsTestJSON-463872970 tempest-SecurityGroupsTestJSON-463872970-project-member] [instance: c01ad807-a9a5-4028-baf4-0469a6301459] Start building networks asynchronously for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1231.811618] env[62277]: DEBUG oslo_concurrency.lockutils [None req-890e0385-1650-49d4-a473-78033af541ec tempest-SecurityGroupsTestJSON-463872970 tempest-SecurityGroupsTestJSON-463872970-project-member] Acquiring lock "c01ad807-a9a5-4028-baf4-0469a6301459" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1231.834852] env[62277]: DEBUG nova.compute.claims [None req-80a2a202-71d0-421b-93e2-fb77247d2728 tempest-SecurityGroupsTestJSON-463872970 tempest-SecurityGroupsTestJSON-463872970-project-member] [instance: c01ad807-a9a5-4028-baf4-0469a6301459] Aborting claim: {{(pid=62277) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1231.835064] env[62277]: DEBUG oslo_concurrency.lockutils [None req-80a2a202-71d0-421b-93e2-fb77247d2728 tempest-SecurityGroupsTestJSON-463872970 tempest-SecurityGroupsTestJSON-463872970-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1231.835285] env[62277]: DEBUG oslo_concurrency.lockutils [None req-80a2a202-71d0-421b-93e2-fb77247d2728 tempest-SecurityGroupsTestJSON-463872970 tempest-SecurityGroupsTestJSON-463872970-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1232.242017] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-72dc9eeb-4e39-41d5-8418-09108065faee {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1232.248931] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09ed5b7d-77f6-4b31-a443-81182d7c1eb0 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1232.280118] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ad1630d-b9ca-4c44-a96a-a6092b6db3c6 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1232.288269] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-70b8cb24-3986-412f-b939-8d2835a9b05a {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1232.301905] env[62277]: DEBUG nova.compute.provider_tree [None 
req-80a2a202-71d0-421b-93e2-fb77247d2728 tempest-SecurityGroupsTestJSON-463872970 tempest-SecurityGroupsTestJSON-463872970-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1232.314996] env[62277]: DEBUG nova.scheduler.client.report [None req-80a2a202-71d0-421b-93e2-fb77247d2728 tempest-SecurityGroupsTestJSON-463872970 tempest-SecurityGroupsTestJSON-463872970-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1232.334387] env[62277]: DEBUG oslo_concurrency.lockutils [None req-80a2a202-71d0-421b-93e2-fb77247d2728 tempest-SecurityGroupsTestJSON-463872970 tempest-SecurityGroupsTestJSON-463872970-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.499s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1232.337029] env[62277]: DEBUG nova.compute.utils [None req-80a2a202-71d0-421b-93e2-fb77247d2728 tempest-SecurityGroupsTestJSON-463872970 tempest-SecurityGroupsTestJSON-463872970-project-member] [instance: c01ad807-a9a5-4028-baf4-0469a6301459] Conflict updating instance c01ad807-a9a5-4028-baf4-0469a6301459. Expected: {'task_state': [None]}. Actual: {'task_state': 'deleting'} {{(pid=62277) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1232.337029] env[62277]: DEBUG nova.compute.manager [None req-80a2a202-71d0-421b-93e2-fb77247d2728 tempest-SecurityGroupsTestJSON-463872970 tempest-SecurityGroupsTestJSON-463872970-project-member] [instance: c01ad807-a9a5-4028-baf4-0469a6301459] Instance disappeared during build. 
{{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2487}} [ 1232.337425] env[62277]: DEBUG nova.compute.manager [None req-80a2a202-71d0-421b-93e2-fb77247d2728 tempest-SecurityGroupsTestJSON-463872970 tempest-SecurityGroupsTestJSON-463872970-project-member] [instance: c01ad807-a9a5-4028-baf4-0469a6301459] Unplugging VIFs for instance {{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1232.337761] env[62277]: DEBUG oslo_concurrency.lockutils [None req-80a2a202-71d0-421b-93e2-fb77247d2728 tempest-SecurityGroupsTestJSON-463872970 tempest-SecurityGroupsTestJSON-463872970-project-member] Acquiring lock "refresh_cache-c01ad807-a9a5-4028-baf4-0469a6301459" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1232.338057] env[62277]: DEBUG oslo_concurrency.lockutils [None req-80a2a202-71d0-421b-93e2-fb77247d2728 tempest-SecurityGroupsTestJSON-463872970 tempest-SecurityGroupsTestJSON-463872970-project-member] Acquired lock "refresh_cache-c01ad807-a9a5-4028-baf4-0469a6301459" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1232.338711] env[62277]: DEBUG nova.network.neutron [None req-80a2a202-71d0-421b-93e2-fb77247d2728 tempest-SecurityGroupsTestJSON-463872970 tempest-SecurityGroupsTestJSON-463872970-project-member] [instance: c01ad807-a9a5-4028-baf4-0469a6301459] Building network info cache for instance {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1232.450113] env[62277]: DEBUG nova.network.neutron [None req-80a2a202-71d0-421b-93e2-fb77247d2728 tempest-SecurityGroupsTestJSON-463872970 tempest-SecurityGroupsTestJSON-463872970-project-member] [instance: c01ad807-a9a5-4028-baf4-0469a6301459] Instance cache missing network info. {{(pid=62277) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1232.709431] env[62277]: DEBUG nova.network.neutron [None req-80a2a202-71d0-421b-93e2-fb77247d2728 tempest-SecurityGroupsTestJSON-463872970 tempest-SecurityGroupsTestJSON-463872970-project-member] [instance: c01ad807-a9a5-4028-baf4-0469a6301459] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1232.729162] env[62277]: DEBUG oslo_concurrency.lockutils [None req-80a2a202-71d0-421b-93e2-fb77247d2728 tempest-SecurityGroupsTestJSON-463872970 tempest-SecurityGroupsTestJSON-463872970-project-member] Releasing lock "refresh_cache-c01ad807-a9a5-4028-baf4-0469a6301459" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1232.729162] env[62277]: DEBUG nova.compute.manager [None req-80a2a202-71d0-421b-93e2-fb77247d2728 tempest-SecurityGroupsTestJSON-463872970 tempest-SecurityGroupsTestJSON-463872970-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1232.729162] env[62277]: DEBUG nova.compute.manager [None req-80a2a202-71d0-421b-93e2-fb77247d2728 tempest-SecurityGroupsTestJSON-463872970 tempest-SecurityGroupsTestJSON-463872970-project-member] [instance: c01ad807-a9a5-4028-baf4-0469a6301459] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1232.729162] env[62277]: DEBUG nova.network.neutron [None req-80a2a202-71d0-421b-93e2-fb77247d2728 tempest-SecurityGroupsTestJSON-463872970 tempest-SecurityGroupsTestJSON-463872970-project-member] [instance: c01ad807-a9a5-4028-baf4-0469a6301459] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1232.747963] env[62277]: DEBUG nova.network.neutron [None req-80a2a202-71d0-421b-93e2-fb77247d2728 tempest-SecurityGroupsTestJSON-463872970 tempest-SecurityGroupsTestJSON-463872970-project-member] [instance: c01ad807-a9a5-4028-baf4-0469a6301459] Instance cache missing network info. {{(pid=62277) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1232.758136] env[62277]: DEBUG nova.network.neutron [None req-80a2a202-71d0-421b-93e2-fb77247d2728 tempest-SecurityGroupsTestJSON-463872970 tempest-SecurityGroupsTestJSON-463872970-project-member] [instance: c01ad807-a9a5-4028-baf4-0469a6301459] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1232.767058] env[62277]: INFO nova.compute.manager [None req-80a2a202-71d0-421b-93e2-fb77247d2728 tempest-SecurityGroupsTestJSON-463872970 tempest-SecurityGroupsTestJSON-463872970-project-member] [instance: c01ad807-a9a5-4028-baf4-0469a6301459] Took 0.04 seconds to deallocate network for instance. 
[ 1232.859083] env[62277]: INFO nova.scheduler.client.report [None req-80a2a202-71d0-421b-93e2-fb77247d2728 tempest-SecurityGroupsTestJSON-463872970 tempest-SecurityGroupsTestJSON-463872970-project-member] Deleted allocations for instance c01ad807-a9a5-4028-baf4-0469a6301459 [ 1232.859374] env[62277]: DEBUG oslo_concurrency.lockutils [None req-80a2a202-71d0-421b-93e2-fb77247d2728 tempest-SecurityGroupsTestJSON-463872970 tempest-SecurityGroupsTestJSON-463872970-project-member] Lock "c01ad807-a9a5-4028-baf4-0469a6301459" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 199.039s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1232.860599] env[62277]: DEBUG oslo_concurrency.lockutils [None req-890e0385-1650-49d4-a473-78033af541ec tempest-SecurityGroupsTestJSON-463872970 tempest-SecurityGroupsTestJSON-463872970-project-member] Lock "c01ad807-a9a5-4028-baf4-0469a6301459" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 1.049s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1232.860811] env[62277]: DEBUG oslo_concurrency.lockutils [None req-890e0385-1650-49d4-a473-78033af541ec tempest-SecurityGroupsTestJSON-463872970 tempest-SecurityGroupsTestJSON-463872970-project-member] Acquiring lock "c01ad807-a9a5-4028-baf4-0469a6301459-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1232.861016] env[62277]: DEBUG oslo_concurrency.lockutils [None req-890e0385-1650-49d4-a473-78033af541ec tempest-SecurityGroupsTestJSON-463872970 tempest-SecurityGroupsTestJSON-463872970-project-member] Lock "c01ad807-a9a5-4028-baf4-0469a6301459-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1232.862247] env[62277]: DEBUG oslo_concurrency.lockutils [None req-890e0385-1650-49d4-a473-78033af541ec tempest-SecurityGroupsTestJSON-463872970 tempest-SecurityGroupsTestJSON-463872970-project-member] Lock "c01ad807-a9a5-4028-baf4-0469a6301459-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.001s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1232.863917] env[62277]: INFO nova.compute.manager [None req-890e0385-1650-49d4-a473-78033af541ec tempest-SecurityGroupsTestJSON-463872970 tempest-SecurityGroupsTestJSON-463872970-project-member] [instance: c01ad807-a9a5-4028-baf4-0469a6301459] Terminating instance [ 1232.867028] env[62277]: DEBUG oslo_concurrency.lockutils [None req-890e0385-1650-49d4-a473-78033af541ec tempest-SecurityGroupsTestJSON-463872970 tempest-SecurityGroupsTestJSON-463872970-project-member] Acquiring lock "refresh_cache-c01ad807-a9a5-4028-baf4-0469a6301459" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1232.867028] env[62277]: DEBUG oslo_concurrency.lockutils [None req-890e0385-1650-49d4-a473-78033af541ec tempest-SecurityGroupsTestJSON-463872970 tempest-SecurityGroupsTestJSON-463872970-project-member] Acquired lock "refresh_cache-c01ad807-a9a5-4028-baf4-0469a6301459" {{(pid=62277) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1232.867028] env[62277]: DEBUG nova.network.neutron [None req-890e0385-1650-49d4-a473-78033af541ec tempest-SecurityGroupsTestJSON-463872970 tempest-SecurityGroupsTestJSON-463872970-project-member] [instance: c01ad807-a9a5-4028-baf4-0469a6301459] Building network info cache for instance {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1232.870257] env[62277]: DEBUG nova.compute.manager [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1232.921168] env[62277]: DEBUG oslo_concurrency.lockutils [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1232.922024] env[62277]: DEBUG oslo_concurrency.lockutils [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1232.923033] env[62277]: INFO nova.compute.claims [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1232.933616] env[62277]: DEBUG nova.network.neutron [None req-890e0385-1650-49d4-a473-78033af541ec tempest-SecurityGroupsTestJSON-463872970 tempest-SecurityGroupsTestJSON-463872970-project-member] [instance: c01ad807-a9a5-4028-baf4-0469a6301459] Instance cache missing network info. {{(pid=62277) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1233.209117] env[62277]: DEBUG nova.network.neutron [None req-890e0385-1650-49d4-a473-78033af541ec tempest-SecurityGroupsTestJSON-463872970 tempest-SecurityGroupsTestJSON-463872970-project-member] [instance: c01ad807-a9a5-4028-baf4-0469a6301459] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1233.223706] env[62277]: DEBUG oslo_concurrency.lockutils [None req-890e0385-1650-49d4-a473-78033af541ec tempest-SecurityGroupsTestJSON-463872970 tempest-SecurityGroupsTestJSON-463872970-project-member] Releasing lock "refresh_cache-c01ad807-a9a5-4028-baf4-0469a6301459" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1233.224211] env[62277]: DEBUG nova.compute.manager [None req-890e0385-1650-49d4-a473-78033af541ec tempest-SecurityGroupsTestJSON-463872970 tempest-SecurityGroupsTestJSON-463872970-project-member] [instance: c01ad807-a9a5-4028-baf4-0469a6301459] Start destroying the instance on the hypervisor. 
{{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1233.224518] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-890e0385-1650-49d4-a473-78033af541ec tempest-SecurityGroupsTestJSON-463872970 tempest-SecurityGroupsTestJSON-463872970-project-member] [instance: c01ad807-a9a5-4028-baf4-0469a6301459] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1233.225197] env[62277]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-b12e9996-d0e5-4c5b-9509-080f720e83df {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1233.240738] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e0717b26-1e10-4010-aba5-88965d5119ee {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1233.274639] env[62277]: WARNING nova.virt.vmwareapi.vmops [None req-890e0385-1650-49d4-a473-78033af541ec tempest-SecurityGroupsTestJSON-463872970 tempest-SecurityGroupsTestJSON-463872970-project-member] [instance: c01ad807-a9a5-4028-baf4-0469a6301459] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance c01ad807-a9a5-4028-baf4-0469a6301459 could not be found. [ 1233.274921] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-890e0385-1650-49d4-a473-78033af541ec tempest-SecurityGroupsTestJSON-463872970 tempest-SecurityGroupsTestJSON-463872970-project-member] [instance: c01ad807-a9a5-4028-baf4-0469a6301459] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1233.277355] env[62277]: INFO nova.compute.manager [None req-890e0385-1650-49d4-a473-78033af541ec tempest-SecurityGroupsTestJSON-463872970 tempest-SecurityGroupsTestJSON-463872970-project-member] [instance: c01ad807-a9a5-4028-baf4-0469a6301459] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1233.277355] env[62277]: DEBUG oslo.service.loopingcall [None req-890e0385-1650-49d4-a473-78033af541ec tempest-SecurityGroupsTestJSON-463872970 tempest-SecurityGroupsTestJSON-463872970-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1233.280798] env[62277]: DEBUG nova.compute.manager [-] [instance: c01ad807-a9a5-4028-baf4-0469a6301459] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1233.280798] env[62277]: DEBUG nova.network.neutron [-] [instance: c01ad807-a9a5-4028-baf4-0469a6301459] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1233.304350] env[62277]: DEBUG nova.network.neutron [-] [instance: c01ad807-a9a5-4028-baf4-0469a6301459] Instance cache missing network info. {{(pid=62277) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1233.330160] env[62277]: DEBUG nova.network.neutron [-] [instance: c01ad807-a9a5-4028-baf4-0469a6301459] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1233.356931] env[62277]: INFO nova.compute.manager [-] [instance: c01ad807-a9a5-4028-baf4-0469a6301459] Took 0.08 seconds to deallocate network for instance. 
[ 1233.374788] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dd87cdc6-3598-4cfc-9b07-e973f4c930e5 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1233.385675] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28c9bd0d-19a6-4524-9a33-a0e464f9e4f4 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1233.423376] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eec3d99b-43d7-4221-8ecb-95aff2b05aa1 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1233.438904] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e98d7209-24cc-478f-b971-ba44c9b195cf {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1233.457018] env[62277]: DEBUG nova.compute.provider_tree [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1233.470091] env[62277]: DEBUG nova.scheduler.client.report [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1233.497982] env[62277]: DEBUG oslo_concurrency.lockutils [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.576s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1233.498515] env[62277]: DEBUG nova.compute.manager [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Start building networks asynchronously for instance. 
{{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1233.505798] env[62277]: DEBUG oslo_concurrency.lockutils [None req-890e0385-1650-49d4-a473-78033af541ec tempest-SecurityGroupsTestJSON-463872970 tempest-SecurityGroupsTestJSON-463872970-project-member] Lock "c01ad807-a9a5-4028-baf4-0469a6301459" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.645s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1233.536697] env[62277]: DEBUG nova.compute.utils [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Using /dev/sd instead of None {{(pid=62277) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1233.537942] env[62277]: DEBUG nova.compute.manager [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Allocating IP information in the background. {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1233.538141] env[62277]: DEBUG nova.network.neutron [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] allocate_for_instance() {{(pid=62277) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1233.546628] env[62277]: DEBUG nova.compute.manager [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Start building block device mappings for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1233.626655] env[62277]: DEBUG nova.compute.manager [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Start spawning the instance on the hypervisor. 
{{(pid=62277) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1233.632202] env[62277]: DEBUG nova.policy [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '926425950a154e14947c3eced1b9457c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e8f7d02fcf2840ea9a952c2c1c7a0cb8', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62277) authorize /opt/stack/nova/nova/policy.py:203}} [ 1233.658051] env[62277]: DEBUG nova.virt.hardware [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-11-06T22:11:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-11-06T22:11:15Z,direct_url=,disk_format='vmdk',id=6f125163-af69-40e9-92ae-3b8a01d74b60,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='63ed6630e9c140baa826f53d7a0564d1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-11-06T22:11:16Z,virtual_size=,visibility=), allow threads: False {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1233.658328] env[62277]: DEBUG nova.virt.hardware [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Flavor limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1233.658484] env[62277]: DEBUG nova.virt.hardware [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Image limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1233.658663] env[62277]: DEBUG nova.virt.hardware [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Flavor pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1233.658803] env[62277]: DEBUG nova.virt.hardware [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Image pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1233.658941] env[62277]: DEBUG nova.virt.hardware [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 
{{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1233.659278] env[62277]: DEBUG nova.virt.hardware [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1233.659448] env[62277]: DEBUG nova.virt.hardware [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1233.659616] env[62277]: DEBUG nova.virt.hardware [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Got 1 possible topologies {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1233.659777] env[62277]: DEBUG nova.virt.hardware [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1233.659941] env[62277]: DEBUG nova.virt.hardware [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1233.660803] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-00e62a9a-5684-41c7-a4f9-47e8c50465c2 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1233.669708] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f6534491-7952-4bf0-ae1f-d6a2462a0943 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1234.618904] env[62277]: DEBUG nova.network.neutron [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Successfully created port: 03b34603-9b85-4c6d-a21e-ae98500b8e79 {{(pid=62277) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1236.128787] env[62277]: DEBUG nova.network.neutron [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Successfully updated port: 03b34603-9b85-4c6d-a21e-ae98500b8e79 {{(pid=62277) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1236.180835] env[62277]: DEBUG oslo_concurrency.lockutils [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Acquiring lock 
"refresh_cache-1e8429b2-7149-4832-8590-e0ebd8501176" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1236.180941] env[62277]: DEBUG oslo_concurrency.lockutils [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Acquired lock "refresh_cache-1e8429b2-7149-4832-8590-e0ebd8501176" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1236.181109] env[62277]: DEBUG nova.network.neutron [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Building network info cache for instance {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1236.237164] env[62277]: DEBUG nova.network.neutron [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Instance cache missing network info. {{(pid=62277) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1236.496933] env[62277]: DEBUG nova.network.neutron [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Updating instance_info_cache with network_info: [{"id": "03b34603-9b85-4c6d-a21e-ae98500b8e79", "address": "fa:16:3e:ce:b7:e1", "network": {"id": "ec63c802-6c72-4140-9ee7-7c32657a966b", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-1854317328-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e8f7d02fcf2840ea9a952c2c1c7a0cb8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "983826cf-6390-4ec6-bf97-30a1060947fc", "external-id": "nsx-vlan-transportzone-367", "segmentation_id": 367, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap03b34603-9b", "ovs_interfaceid": "03b34603-9b85-4c6d-a21e-ae98500b8e79", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1236.512325] env[62277]: DEBUG oslo_concurrency.lockutils [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Releasing lock "refresh_cache-1e8429b2-7149-4832-8590-e0ebd8501176" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1236.512628] env[62277]: DEBUG nova.compute.manager [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 
tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Instance network_info: |[{"id": "03b34603-9b85-4c6d-a21e-ae98500b8e79", "address": "fa:16:3e:ce:b7:e1", "network": {"id": "ec63c802-6c72-4140-9ee7-7c32657a966b", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-1854317328-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e8f7d02fcf2840ea9a952c2c1c7a0cb8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "983826cf-6390-4ec6-bf97-30a1060947fc", "external-id": "nsx-vlan-transportzone-367", "segmentation_id": 367, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap03b34603-9b", "ovs_interfaceid": "03b34603-9b85-4c6d-a21e-ae98500b8e79", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1236.513051] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:ce:b7:e1', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '983826cf-6390-4ec6-bf97-30a1060947fc', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '03b34603-9b85-4c6d-a21e-ae98500b8e79', 'vif_model': 'vmxnet3'}] {{(pid=62277) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1236.520421] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Creating folder: Project (e8f7d02fcf2840ea9a952c2c1c7a0cb8). Parent ref: group-v297781. {{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1236.521015] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a3dd2310-d7f6-45a8-962c-5038a9b91429 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1236.532255] env[62277]: INFO nova.virt.vmwareapi.vm_util [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Created folder: Project (e8f7d02fcf2840ea9a952c2c1c7a0cb8) in parent group-v297781. [ 1236.532443] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Creating folder: Instances. Parent ref: group-v297834. 
{{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1236.532687] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-101b9358-a0de-4b21-b839-8ae202c7acad {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1236.541860] env[62277]: INFO nova.virt.vmwareapi.vm_util [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Created folder: Instances in parent group-v297834. [ 1236.542094] env[62277]: DEBUG oslo.service.loopingcall [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1236.542274] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Creating VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1236.542468] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-a594ccc5-adff-435a-8c63-535849e626ce {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1236.561552] env[62277]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1236.561552] env[62277]: value = "task-1405377" [ 1236.561552] env[62277]: _type = "Task" [ 1236.561552] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1236.569121] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405377, 'name': CreateVM_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1236.724325] env[62277]: DEBUG nova.compute.manager [req-f08e2820-ba4e-4644-a0da-0f6f27285531 req-e7c9e1c0-d2b8-4f29-9943-3cf643f6e1e6 service nova] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Received event network-vif-plugged-03b34603-9b85-4c6d-a21e-ae98500b8e79 {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 1236.724563] env[62277]: DEBUG oslo_concurrency.lockutils [req-f08e2820-ba4e-4644-a0da-0f6f27285531 req-e7c9e1c0-d2b8-4f29-9943-3cf643f6e1e6 service nova] Acquiring lock "1e8429b2-7149-4832-8590-e0ebd8501176-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1236.724789] env[62277]: DEBUG oslo_concurrency.lockutils [req-f08e2820-ba4e-4644-a0da-0f6f27285531 req-e7c9e1c0-d2b8-4f29-9943-3cf643f6e1e6 service nova] Lock "1e8429b2-7149-4832-8590-e0ebd8501176-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1236.724964] env[62277]: DEBUG oslo_concurrency.lockutils [req-f08e2820-ba4e-4644-a0da-0f6f27285531 req-e7c9e1c0-d2b8-4f29-9943-3cf643f6e1e6 service nova] Lock "1e8429b2-7149-4832-8590-e0ebd8501176-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1236.726662] env[62277]: DEBUG nova.compute.manager [req-f08e2820-ba4e-4644-a0da-0f6f27285531 req-e7c9e1c0-d2b8-4f29-9943-3cf643f6e1e6 service nova] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] No waiting events found dispatching network-vif-plugged-03b34603-9b85-4c6d-a21e-ae98500b8e79 {{(pid=62277) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1236.727197] env[62277]: WARNING nova.compute.manager [req-f08e2820-ba4e-4644-a0da-0f6f27285531 req-e7c9e1c0-d2b8-4f29-9943-3cf643f6e1e6 service nova] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Received unexpected event network-vif-plugged-03b34603-9b85-4c6d-a21e-ae98500b8e79 for instance with vm_state building and task_state spawning. [ 1237.071688] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405377, 'name': CreateVM_Task, 'duration_secs': 0.302245} completed successfully. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1237.071859] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Created VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1237.073018] env[62277]: DEBUG oslo_concurrency.lockutils [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1237.073018] env[62277]: DEBUG oslo_concurrency.lockutils [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1237.073018] env[62277]: DEBUG oslo_concurrency.lockutils [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1237.073322] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4551058d-336d-46f1-8338-92308f54ecbf {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1237.077756] env[62277]: DEBUG oslo_vmware.api [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Waiting for the task: (returnval){ [ 1237.077756] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52e3108e-e273-2c9a-4d9b-9d6ddb7a1f88" [ 1237.077756] env[62277]: _type = "Task" [ 1237.077756] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1237.087047] env[62277]: DEBUG oslo_vmware.api [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52e3108e-e273-2c9a-4d9b-9d6ddb7a1f88, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1237.593108] env[62277]: DEBUG oslo_concurrency.lockutils [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1237.593486] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Processing image 6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1237.593622] env[62277]: DEBUG oslo_concurrency.lockutils [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1239.002457] env[62277]: DEBUG oslo_concurrency.lockutils [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Acquiring lock "8d00162c-7379-48b6-841b-f802db2582db" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1239.002787] env[62277]: DEBUG oslo_concurrency.lockutils [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Lock "8d00162c-7379-48b6-841b-f802db2582db" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1239.699218] env[62277]: DEBUG nova.compute.manager [req-3223af69-28bc-4a83-8c74-ccb56382e23f req-085b6ea5-f2ce-416f-9ef6-1143718e463e service nova] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Received event network-changed-03b34603-9b85-4c6d-a21e-ae98500b8e79 {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 1239.699488] env[62277]: DEBUG nova.compute.manager [req-3223af69-28bc-4a83-8c74-ccb56382e23f req-085b6ea5-f2ce-416f-9ef6-1143718e463e service nova] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Refreshing instance network info cache due to event network-changed-03b34603-9b85-4c6d-a21e-ae98500b8e79. 
{{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11104}} [ 1239.699683] env[62277]: DEBUG oslo_concurrency.lockutils [req-3223af69-28bc-4a83-8c74-ccb56382e23f req-085b6ea5-f2ce-416f-9ef6-1143718e463e service nova] Acquiring lock "refresh_cache-1e8429b2-7149-4832-8590-e0ebd8501176" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1239.699760] env[62277]: DEBUG oslo_concurrency.lockutils [req-3223af69-28bc-4a83-8c74-ccb56382e23f req-085b6ea5-f2ce-416f-9ef6-1143718e463e service nova] Acquired lock "refresh_cache-1e8429b2-7149-4832-8590-e0ebd8501176" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1239.699916] env[62277]: DEBUG nova.network.neutron [req-3223af69-28bc-4a83-8c74-ccb56382e23f req-085b6ea5-f2ce-416f-9ef6-1143718e463e service nova] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Refreshing network info cache for port 03b34603-9b85-4c6d-a21e-ae98500b8e79 {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1239.848292] env[62277]: DEBUG oslo_concurrency.lockutils [None req-0d29b2f9-dc86-48fa-abac-7c0ed78c0e4e tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Acquiring lock "1e8429b2-7149-4832-8590-e0ebd8501176" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1240.069329] env[62277]: DEBUG nova.network.neutron [req-3223af69-28bc-4a83-8c74-ccb56382e23f req-085b6ea5-f2ce-416f-9ef6-1143718e463e service nova] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Updated VIF entry in instance network info cache for port 03b34603-9b85-4c6d-a21e-ae98500b8e79. 
{{(pid=62277) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1240.069668] env[62277]: DEBUG nova.network.neutron [req-3223af69-28bc-4a83-8c74-ccb56382e23f req-085b6ea5-f2ce-416f-9ef6-1143718e463e service nova] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Updating instance_info_cache with network_info: [{"id": "03b34603-9b85-4c6d-a21e-ae98500b8e79", "address": "fa:16:3e:ce:b7:e1", "network": {"id": "ec63c802-6c72-4140-9ee7-7c32657a966b", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-1854317328-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e8f7d02fcf2840ea9a952c2c1c7a0cb8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "983826cf-6390-4ec6-bf97-30a1060947fc", "external-id": "nsx-vlan-transportzone-367", "segmentation_id": 367, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap03b34603-9b", "ovs_interfaceid": "03b34603-9b85-4c6d-a21e-ae98500b8e79", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1240.089244] env[62277]: DEBUG oslo_concurrency.lockutils [req-3223af69-28bc-4a83-8c74-ccb56382e23f req-085b6ea5-f2ce-416f-9ef6-1143718e463e service nova] Releasing lock "refresh_cache-1e8429b2-7149-4832-8590-e0ebd8501176" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1244.252353] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e92c5a32-c7c3-4b88-b3b2-36480963b926 tempest-ServerRescueTestJSON-2122562855 tempest-ServerRescueTestJSON-2122562855-project-member] Acquiring lock "c8d02374-bed2-4b4a-9bab-3a3dec87ad3e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1244.252632] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e92c5a32-c7c3-4b88-b3b2-36480963b926 tempest-ServerRescueTestJSON-2122562855 tempest-ServerRescueTestJSON-2122562855-project-member] Lock "c8d02374-bed2-4b4a-9bab-3a3dec87ad3e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1248.071148] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._sync_power_states {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1248.104387] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Getting list of instances from cluster (obj){ [ 1248.104387] env[62277]: value = "domain-c8" [ 1248.104387] env[62277]: _type = "ClusterComputeResource" [ 1248.104387] env[62277]: } {{(pid=62277) list_instances 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 1248.107625] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-61e7fcc4-9133-4454-9799-9423f504bf2b {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1248.134080] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Got total of 10 instances {{(pid=62277) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 1248.134545] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Triggering sync for uuid dfc291fd-1481-4e76-9fb3-ec87124c1281 {{(pid=62277) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10319}} [ 1248.134545] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Triggering sync for uuid 36ff1435-1999-4e95-8920-81a1b25cc452 {{(pid=62277) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10319}} [ 1248.134656] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Triggering sync for uuid 68925f1b-da69-4955-acb1-d6500b03daee {{(pid=62277) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10319}} [ 1248.134937] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Triggering sync for uuid 866c4415-caab-4d81-86ba-ed662feb3c4f {{(pid=62277) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10319}} [ 1248.135243] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Triggering sync for uuid 350e2302-66b9-4dd6-b0f4-77000992408b {{(pid=62277) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10319}} [ 1248.135337] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Triggering sync for uuid 23bc5a48-3e96-4897-bf28-ad14a0bdde62 {{(pid=62277) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10319}} [ 1248.136035] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Triggering sync for uuid 346748bd-b4e8-4e93-b71d-66c90a45e372 {{(pid=62277) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10319}} [ 1248.136125] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Triggering sync for uuid 930ff058-ab48-4c8a-8f5e-4820a3b12d50 {{(pid=62277) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10319}} [ 1248.136336] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Triggering sync for uuid 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0 {{(pid=62277) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10319}} [ 1248.136533] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Triggering sync for uuid 1e8429b2-7149-4832-8590-e0ebd8501176 {{(pid=62277) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10319}} [ 1248.137011] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "dfc291fd-1481-4e76-9fb3-ec87124c1281" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1248.139022] env[62277]: DEBUG oslo_concurrency.lockutils [None 
req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "36ff1435-1999-4e95-8920-81a1b25cc452" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1248.139022] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "68925f1b-da69-4955-acb1-d6500b03daee" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1248.139022] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "866c4415-caab-4d81-86ba-ed662feb3c4f" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1248.139022] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "350e2302-66b9-4dd6-b0f4-77000992408b" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1248.139022] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "23bc5a48-3e96-4897-bf28-ad14a0bdde62" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1248.139022] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "346748bd-b4e8-4e93-b71d-66c90a45e372" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1248.139022] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "930ff058-ab48-4c8a-8f5e-4820a3b12d50" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1248.139022] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "3b9b51fe-d558-40ef-b22a-2b8958b3e1a0" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1248.139680] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "1e8429b2-7149-4832-8590-e0ebd8501176" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1248.408125] env[62277]: DEBUG oslo_concurrency.lockutils [None req-1548f96a-4393-43bc-95ad-0f75a08c9eaa tempest-ServersV294TestFqdnHostnames-1549436271 
tempest-ServersV294TestFqdnHostnames-1549436271-project-member] Acquiring lock "df611bf9-45db-4940-a59e-fccc7d96b935" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1248.408425] env[62277]: DEBUG oslo_concurrency.lockutils [None req-1548f96a-4393-43bc-95ad-0f75a08c9eaa tempest-ServersV294TestFqdnHostnames-1549436271 tempest-ServersV294TestFqdnHostnames-1549436271-project-member] Lock "df611bf9-45db-4940-a59e-fccc7d96b935" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1251.094478] env[62277]: DEBUG oslo_concurrency.lockutils [None req-eaa6dcfc-5df7-4bce-bf02-13d827a6d37e tempest-SecurityGroupsTestJSON-463872970 tempest-SecurityGroupsTestJSON-463872970-project-member] Acquiring lock "5d595f5e-6d35-4c89-a4e2-a3639c6145c8" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1251.094788] env[62277]: DEBUG oslo_concurrency.lockutils [None req-eaa6dcfc-5df7-4bce-bf02-13d827a6d37e tempest-SecurityGroupsTestJSON-463872970 tempest-SecurityGroupsTestJSON-463872970-project-member] Lock "5d595f5e-6d35-4c89-a4e2-a3639c6145c8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1252.893369] env[62277]: DEBUG oslo_concurrency.lockutils [None req-02618745-f64d-4b68-821f-0b1b3757a349 tempest-ServersTestManualDisk-209415039 tempest-ServersTestManualDisk-209415039-project-member] Acquiring lock "4fd54f91-dedd-4ce2-8acf-8a2123be73b8" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1252.893369] env[62277]: DEBUG oslo_concurrency.lockutils [None req-02618745-f64d-4b68-821f-0b1b3757a349 tempest-ServersTestManualDisk-209415039 tempest-ServersTestManualDisk-209415039-project-member] Lock "4fd54f91-dedd-4ce2-8acf-8a2123be73b8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1255.488897] env[62277]: DEBUG oslo_concurrency.lockutils [None req-2ac43584-bf89-4422-91e8-9ba1f88bac13 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Acquiring lock "baabe4ee-b366-45a8-bf06-cd63f697e7dc" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1255.489897] env[62277]: DEBUG oslo_concurrency.lockutils [None req-2ac43584-bf89-4422-91e8-9ba1f88bac13 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Lock "baabe4ee-b366-45a8-bf06-cd63f697e7dc" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=62277) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1260.654072] env[62277]: DEBUG oslo_concurrency.lockutils [None req-3fee9975-0813-4582-a552-f679545608ec tempest-ServerShowV257Test-724719980 tempest-ServerShowV257Test-724719980-project-member] Acquiring lock "5cf06245-3fa1-4596-8260-7a82bc4a1193" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1260.654406] env[62277]: DEBUG oslo_concurrency.lockutils [None req-3fee9975-0813-4582-a552-f679545608ec tempest-ServerShowV257Test-724719980 tempest-ServerShowV257Test-724719980-project-member] Lock "5cf06245-3fa1-4596-8260-7a82bc4a1193" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1268.768602] env[62277]: DEBUG oslo_concurrency.lockutils [None req-6f620688-b0e0-4890-aa07-12e4e3b48735 tempest-ListServerFiltersTestJSON-1537632818 tempest-ListServerFiltersTestJSON-1537632818-project-member] Acquiring lock "308e5c48-c452-4dbf-94b0-1eb12951e620" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1268.769012] env[62277]: DEBUG oslo_concurrency.lockutils [None req-6f620688-b0e0-4890-aa07-12e4e3b48735 tempest-ListServerFiltersTestJSON-1537632818 tempest-ListServerFiltersTestJSON-1537632818-project-member] Lock "308e5c48-c452-4dbf-94b0-1eb12951e620" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1269.185730] env[62277]: DEBUG oslo_concurrency.lockutils [None req-cfaa7f1e-b1cf-452b-8390-d89b308ccd82 tempest-ListServerFiltersTestJSON-1537632818 tempest-ListServerFiltersTestJSON-1537632818-project-member] Acquiring lock "39172747-1245-473d-9f18-87bae208b5b1" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1269.185961] env[62277]: DEBUG oslo_concurrency.lockutils [None req-cfaa7f1e-b1cf-452b-8390-d89b308ccd82 tempest-ListServerFiltersTestJSON-1537632818 tempest-ListServerFiltersTestJSON-1537632818-project-member] Lock "39172747-1245-473d-9f18-87bae208b5b1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1269.618084] env[62277]: DEBUG oslo_concurrency.lockutils [None req-d087fe8a-ba45-4e7c-8fe4-fdab8a5b8226 tempest-ListServerFiltersTestJSON-1537632818 tempest-ListServerFiltersTestJSON-1537632818-project-member] Acquiring lock "7ecfbbee-4955-4704-af62-ce8f5470cfbe" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1269.618726] env[62277]: DEBUG oslo_concurrency.lockutils [None req-d087fe8a-ba45-4e7c-8fe4-fdab8a5b8226 
tempest-ListServerFiltersTestJSON-1537632818 tempest-ListServerFiltersTestJSON-1537632818-project-member] Lock "7ecfbbee-4955-4704-af62-ce8f5470cfbe" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1276.238743] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1277.168900] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1277.169091] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Starting heal instance info cache {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9909}} [ 1277.169218] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Rebuilding the list of instances to heal {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9913}} [ 1277.191465] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1277.191616] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1277.191742] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 68925f1b-da69-4955-acb1-d6500b03daee] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1277.191870] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1277.192022] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1277.192168] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Skipping network cache update for instance because it is Building. 
{{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1277.192290] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1277.192404] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1277.192514] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1277.192624] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1277.192739] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Didn't find any instances for network info cache update. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9995}} [ 1277.193230] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1279.168047] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager.update_available_resource {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1279.180666] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1279.180888] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1279.181075] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1279.181234] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62277) update_available_resource 
/opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1279.182697] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-24971847-5dd9-4475-b44f-5c1176f0ad5e {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1279.188464] env[62277]: WARNING oslo_vmware.rw_handles [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1279.188464] env[62277]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1279.188464] env[62277]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1279.188464] env[62277]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1279.188464] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1279.188464] env[62277]: ERROR oslo_vmware.rw_handles response.begin() [ 1279.188464] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1279.188464] env[62277]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1279.188464] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1279.188464] env[62277]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1279.188464] env[62277]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1279.188464] env[62277]: ERROR oslo_vmware.rw_handles [ 1279.189046] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] Downloaded image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to vmware_temp/0271724a-606f-404a-b072-f17e15265de5/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1279.191468] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] Caching image {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1279.191818] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Copying Virtual Disk [datastore2] vmware_temp/0271724a-606f-404a-b072-f17e15265de5/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk to [datastore2] vmware_temp/0271724a-606f-404a-b072-f17e15265de5/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk {{(pid=62277) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1279.192363] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-caf8666d-cc4e-4326-b5df-c4b63f5a0cf0 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1279.198122] env[62277]: 
DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4065c5a9-51b7-4ed2-b65d-15c20281eea8 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1279.203225] env[62277]: DEBUG oslo_vmware.api [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Waiting for the task: (returnval){ [ 1279.203225] env[62277]: value = "task-1405378" [ 1279.203225] env[62277]: _type = "Task" [ 1279.203225] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1279.215272] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-821248aa-8151-4865-ac44-633b18c9fbee {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1279.220733] env[62277]: DEBUG oslo_vmware.api [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Task: {'id': task-1405378, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1279.224888] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5eb673f5-a38b-47c9-b083-07801be33fef {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1279.255764] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181447MB free_disk=184GB free_vcpus=48 pci_devices=None {{(pid=62277) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1279.255822] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1279.256026] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1279.331139] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance dfc291fd-1481-4e76-9fb3-ec87124c1281 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1279.331359] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 36ff1435-1999-4e95-8920-81a1b25cc452 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1279.331437] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 68925f1b-da69-4955-acb1-d6500b03daee actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1279.331584] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 866c4415-caab-4d81-86ba-ed662feb3c4f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1279.331704] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 350e2302-66b9-4dd6-b0f4-77000992408b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1279.331836] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 23bc5a48-3e96-4897-bf28-ad14a0bdde62 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1279.331957] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 346748bd-b4e8-4e93-b71d-66c90a45e372 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1279.332088] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 930ff058-ab48-4c8a-8f5e-4820a3b12d50 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1279.332222] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1279.332339] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 1e8429b2-7149-4832-8590-e0ebd8501176 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1279.345347] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 21bd4623-2b46-43c4-859f-c4d3bf261e1f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1279.363487] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 6d759045-e1fc-43ea-a882-1ead769b6d29 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1279.374488] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 32bed248-06d5-47a1-b281-47921d99dbf6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1279.387242] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance e573e784-3318-4a41-89fd-40cbe8749413 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1279.398757] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 06571cd1-61a1-48e9-a204-624d5f383ad3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1279.409177] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 8d00162c-7379-48b6-841b-f802db2582db has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1279.419433] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance c8d02374-bed2-4b4a-9bab-3a3dec87ad3e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1279.430123] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance df611bf9-45db-4940-a59e-fccc7d96b935 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1279.439693] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 5d595f5e-6d35-4c89-a4e2-a3639c6145c8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1279.449377] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 4fd54f91-dedd-4ce2-8acf-8a2123be73b8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1279.458538] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance baabe4ee-b366-45a8-bf06-cd63f697e7dc has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1279.468902] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 5cf06245-3fa1-4596-8260-7a82bc4a1193 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1279.478434] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 308e5c48-c452-4dbf-94b0-1eb12951e620 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1279.488015] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 39172747-1245-473d-9f18-87bae208b5b1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1279.497930] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 7ecfbbee-4955-4704-af62-ce8f5470cfbe has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1279.498221] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1279.498372] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1279.715273] env[62277]: DEBUG oslo_vmware.exceptions [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Fault InvalidArgument not matched. {{(pid=62277) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1279.715633] env[62277]: DEBUG oslo_concurrency.lockutils [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1279.716419] env[62277]: ERROR nova.compute.manager [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1279.716419] env[62277]: Faults: ['InvalidArgument'] [ 1279.716419] env[62277]: ERROR nova.compute.manager [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] Traceback (most recent call last): [ 1279.716419] env[62277]: ERROR nova.compute.manager [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1279.716419] env[62277]: ERROR nova.compute.manager [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] yield resources [ 1279.716419] env[62277]: ERROR nova.compute.manager [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1279.716419] env[62277]: ERROR nova.compute.manager [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] self.driver.spawn(context, instance, image_meta, [ 1279.716419] env[62277]: ERROR nova.compute.manager [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1279.716419] env[62277]: ERROR nova.compute.manager [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] self._vmops.spawn(context, instance, 
image_meta, injected_files, [ 1279.716419] env[62277]: ERROR nova.compute.manager [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1279.716419] env[62277]: ERROR nova.compute.manager [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] self._fetch_image_if_missing(context, vi) [ 1279.716419] env[62277]: ERROR nova.compute.manager [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1279.716419] env[62277]: ERROR nova.compute.manager [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] image_cache(vi, tmp_image_ds_loc) [ 1279.716419] env[62277]: ERROR nova.compute.manager [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1279.716419] env[62277]: ERROR nova.compute.manager [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] vm_util.copy_virtual_disk( [ 1279.716419] env[62277]: ERROR nova.compute.manager [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1279.716419] env[62277]: ERROR nova.compute.manager [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] session._wait_for_task(vmdk_copy_task) [ 1279.716419] env[62277]: ERROR nova.compute.manager [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1279.716419] env[62277]: ERROR nova.compute.manager [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] return self.wait_for_task(task_ref) [ 1279.716419] env[62277]: ERROR nova.compute.manager [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1279.716419] env[62277]: ERROR nova.compute.manager [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] return evt.wait() [ 1279.716419] env[62277]: ERROR nova.compute.manager [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1279.716419] env[62277]: ERROR nova.compute.manager [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] result = hub.switch() [ 1279.716419] env[62277]: ERROR nova.compute.manager [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1279.716419] env[62277]: ERROR nova.compute.manager [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] return self.greenlet.switch() [ 1279.716419] env[62277]: ERROR nova.compute.manager [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1279.716419] env[62277]: ERROR nova.compute.manager [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] self.f(*self.args, **self.kw) [ 1279.716419] env[62277]: ERROR nova.compute.manager [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1279.716419] env[62277]: ERROR nova.compute.manager [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] raise exceptions.translate_fault(task_info.error) [ 1279.716419] env[62277]: ERROR nova.compute.manager [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1279.716419] 
env[62277]: ERROR nova.compute.manager [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] Faults: ['InvalidArgument'] [ 1279.716419] env[62277]: ERROR nova.compute.manager [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] [ 1279.717413] env[62277]: INFO nova.compute.manager [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] Terminating instance [ 1279.719366] env[62277]: DEBUG oslo_concurrency.lockutils [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1279.720074] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1279.721358] env[62277]: DEBUG nova.compute.manager [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] Start destroying the instance on the hypervisor. {{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1279.721358] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1279.724280] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-84a19ded-f70c-4f77-b1b4-2cd90a857766 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1279.728466] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3544a572-8ef4-440b-a1a6-cff185115587 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1279.736213] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] Unregistering the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1279.736213] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-3fd665c9-3bd4-4d90-8fae-59361ac6aba5 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1279.738185] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1279.738349] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None 
req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62277) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1279.739311] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b4eb1bf7-efc1-4b1c-ab6f-bf2156d3ccb8 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1279.747098] env[62277]: DEBUG oslo_vmware.api [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Waiting for the task: (returnval){ [ 1279.747098] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]526e50a5-d940-afc3-1b1b-e856cbd67d5a" [ 1279.747098] env[62277]: _type = "Task" [ 1279.747098] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1279.762375] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] [instance: 68925f1b-da69-4955-acb1-d6500b03daee] Preparing fetch location {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1279.762375] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Creating directory with path [datastore2] vmware_temp/85b4e6f8-64d9-4e6a-b90c-828384a37a3c/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1279.762375] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b2602b29-e4b0-4861-ba02-a4a91fc594c1 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1279.785220] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Created directory with path [datastore2] vmware_temp/85b4e6f8-64d9-4e6a-b90c-828384a37a3c/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1279.785449] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] [instance: 68925f1b-da69-4955-acb1-d6500b03daee] Fetch image to [datastore2] vmware_temp/85b4e6f8-64d9-4e6a-b90c-828384a37a3c/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1279.785626] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] [instance: 68925f1b-da69-4955-acb1-d6500b03daee] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to [datastore2] vmware_temp/85b4e6f8-64d9-4e6a-b90c-828384a37a3c/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) 
_fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1279.786435] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71ab5d42-5ff3-4600-92e1-8725ed0cd4ad {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1279.794421] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-72421c32-4b05-4efd-91c1-3ea256afc1bd {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1279.804030] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca1899ce-8fde-4c2a-b8bd-bf6a9f5c7971 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1279.815849] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] Unregistered the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1279.815849] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] Deleting contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1279.815849] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Deleting the datastore file [datastore2] dfc291fd-1481-4e76-9fb3-ec87124c1281 {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1279.815849] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-b0c7473c-9e61-4f87-805e-7954763cad84 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1279.842521] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6b96782a-3390-45e7-8e84-4dad1da2c9db {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1279.845865] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5d6cc37f-bd82-497b-9b08-d816871417f1 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1279.850532] env[62277]: DEBUG oslo_vmware.api [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Waiting for the task: (returnval){ [ 1279.850532] env[62277]: value = "task-1405380" [ 1279.850532] env[62277]: _type = "Task" [ 1279.850532] env[62277]: } to complete. 
{{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1279.858769] env[62277]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-6b52697c-bfab-474d-b27b-47fe34ed3cfe {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1279.865433] env[62277]: DEBUG oslo_vmware.api [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Task: {'id': task-1405380, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1279.866843] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a2275e77-6544-445e-b4e9-5c7efb7b4dc1 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1279.898453] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2df083ee-1c6e-4477-8a97-4d85de9389af {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1279.901260] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] [instance: 68925f1b-da69-4955-acb1-d6500b03daee] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1279.908168] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aebfb827-7e44-4526-8dc2-d149b7422852 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1279.924240] env[62277]: DEBUG nova.compute.provider_tree [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1279.932904] env[62277]: DEBUG nova.scheduler.client.report [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1279.947686] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62277) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1279.947878] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.692s {{(pid=62277) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1279.966875] env[62277]: DEBUG oslo_vmware.rw_handles [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/85b4e6f8-64d9-4e6a-b90c-828384a37a3c/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62277) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1280.030798] env[62277]: DEBUG oslo_vmware.rw_handles [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Completed reading data from the image iterator. {{(pid=62277) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1280.031015] env[62277]: DEBUG oslo_vmware.rw_handles [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/85b4e6f8-64d9-4e6a-b90c-828384a37a3c/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62277) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1280.362392] env[62277]: DEBUG oslo_vmware.api [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Task: {'id': task-1405380, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.068465} completed successfully. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1280.362683] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Deleted the datastore file {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1280.362845] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] Deleted contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1280.363019] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1280.363194] env[62277]: INFO nova.compute.manager [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] Took 0.64 seconds to destroy the instance on the hypervisor. 
[ 1280.365336] env[62277]: DEBUG nova.compute.claims [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] Aborting claim: {{(pid=62277) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1280.365544] env[62277]: DEBUG oslo_concurrency.lockutils [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1280.365712] env[62277]: DEBUG oslo_concurrency.lockutils [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1280.700255] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-936ab8a6-9581-4694-b62d-2a811b3d90bd {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1280.708939] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a03380b9-e999-4f07-ac80-1cff24b35039 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1280.739272] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b25c5764-f671-48a4-b4ab-f4af679abb20 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1280.747065] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f6167c3b-5db6-445f-b21c-d39114351e9a {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1280.760204] env[62277]: DEBUG nova.compute.provider_tree [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1280.769244] env[62277]: DEBUG nova.scheduler.client.report [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1280.787473] env[62277]: DEBUG oslo_concurrency.lockutils [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Lock 
"compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.422s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1280.788026] env[62277]: ERROR nova.compute.manager [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1280.788026] env[62277]: Faults: ['InvalidArgument'] [ 1280.788026] env[62277]: ERROR nova.compute.manager [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] Traceback (most recent call last): [ 1280.788026] env[62277]: ERROR nova.compute.manager [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1280.788026] env[62277]: ERROR nova.compute.manager [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] self.driver.spawn(context, instance, image_meta, [ 1280.788026] env[62277]: ERROR nova.compute.manager [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1280.788026] env[62277]: ERROR nova.compute.manager [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1280.788026] env[62277]: ERROR nova.compute.manager [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1280.788026] env[62277]: ERROR nova.compute.manager [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] self._fetch_image_if_missing(context, vi) [ 1280.788026] env[62277]: ERROR nova.compute.manager [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1280.788026] env[62277]: ERROR nova.compute.manager [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] image_cache(vi, tmp_image_ds_loc) [ 1280.788026] env[62277]: ERROR nova.compute.manager [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1280.788026] env[62277]: ERROR nova.compute.manager [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] vm_util.copy_virtual_disk( [ 1280.788026] env[62277]: ERROR nova.compute.manager [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1280.788026] env[62277]: ERROR nova.compute.manager [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] session._wait_for_task(vmdk_copy_task) [ 1280.788026] env[62277]: ERROR nova.compute.manager [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1280.788026] env[62277]: ERROR nova.compute.manager [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] return self.wait_for_task(task_ref) [ 1280.788026] env[62277]: ERROR nova.compute.manager [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1280.788026] env[62277]: ERROR nova.compute.manager [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] return evt.wait() [ 1280.788026] env[62277]: ERROR nova.compute.manager [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1280.788026] env[62277]: ERROR nova.compute.manager [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] result = hub.switch() [ 1280.788026] env[62277]: ERROR nova.compute.manager [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1280.788026] env[62277]: ERROR nova.compute.manager [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] return self.greenlet.switch() [ 1280.788026] env[62277]: ERROR nova.compute.manager [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1280.788026] env[62277]: ERROR nova.compute.manager [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] self.f(*self.args, **self.kw) [ 1280.788026] env[62277]: ERROR nova.compute.manager [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1280.788026] env[62277]: ERROR nova.compute.manager [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] raise exceptions.translate_fault(task_info.error) [ 1280.788026] env[62277]: ERROR nova.compute.manager [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1280.788026] env[62277]: ERROR nova.compute.manager [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] Faults: ['InvalidArgument'] [ 1280.788026] env[62277]: ERROR nova.compute.manager [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] [ 1280.790030] env[62277]: DEBUG nova.compute.utils [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] VimFaultException {{(pid=62277) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1280.793709] env[62277]: DEBUG nova.compute.manager [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] Build of instance dfc291fd-1481-4e76-9fb3-ec87124c1281 was re-scheduled: A specified parameter was not correct: fileType [ 1280.793709] env[62277]: Faults: ['InvalidArgument'] {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1280.794203] env[62277]: DEBUG nova.compute.manager [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] Unplugging VIFs for instance {{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1280.794203] env[62277]: DEBUG nova.compute.manager [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1280.794331] env[62277]: DEBUG nova.compute.manager [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1280.794668] env[62277]: DEBUG nova.network.neutron [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1280.943321] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1280.943693] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1280.943875] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1281.185562] env[62277]: DEBUG nova.network.neutron [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1281.198845] env[62277]: INFO nova.compute.manager [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] Took 0.40 seconds to deallocate network for instance. 
[ 1281.323032] env[62277]: INFO nova.scheduler.client.report [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Deleted allocations for instance dfc291fd-1481-4e76-9fb3-ec87124c1281 [ 1281.349491] env[62277]: DEBUG oslo_concurrency.lockutils [None req-db368363-3aec-44b2-aa8c-0f56c8d00890 tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Lock "dfc291fd-1481-4e76-9fb3-ec87124c1281" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 286.201s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1281.350773] env[62277]: DEBUG oslo_concurrency.lockutils [None req-f6de3ad1-5655-4014-bb0c-f1bdc9de330e tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Lock "dfc291fd-1481-4e76-9fb3-ec87124c1281" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 85.955s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1281.350991] env[62277]: DEBUG oslo_concurrency.lockutils [None req-f6de3ad1-5655-4014-bb0c-f1bdc9de330e tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Acquiring lock "dfc291fd-1481-4e76-9fb3-ec87124c1281-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1281.351212] env[62277]: DEBUG oslo_concurrency.lockutils [None req-f6de3ad1-5655-4014-bb0c-f1bdc9de330e tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Lock "dfc291fd-1481-4e76-9fb3-ec87124c1281-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1281.351377] env[62277]: DEBUG oslo_concurrency.lockutils [None req-f6de3ad1-5655-4014-bb0c-f1bdc9de330e tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Lock "dfc291fd-1481-4e76-9fb3-ec87124c1281-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1281.354018] env[62277]: INFO nova.compute.manager [None req-f6de3ad1-5655-4014-bb0c-f1bdc9de330e tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] Terminating instance [ 1281.355856] env[62277]: DEBUG nova.compute.manager [None req-f6de3ad1-5655-4014-bb0c-f1bdc9de330e tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] Start destroying the instance on the hypervisor. 
{{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1281.356062] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-f6de3ad1-5655-4014-bb0c-f1bdc9de330e tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1281.356829] env[62277]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-2b41068e-8faf-4233-bd0e-41a8e9e44b99 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1281.365358] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2bdb47d0-276e-4fc9-8098-f2d189104636 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1281.376496] env[62277]: DEBUG nova.compute.manager [None req-8efded13-d761-4957-905d-83cb75b63cfd tempest-ServerDiagnosticsTest-825586886 tempest-ServerDiagnosticsTest-825586886-project-member] [instance: f646a534-1ae8-40dd-9819-3d71bda87ae2] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1281.396098] env[62277]: WARNING nova.virt.vmwareapi.vmops [None req-f6de3ad1-5655-4014-bb0c-f1bdc9de330e tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance dfc291fd-1481-4e76-9fb3-ec87124c1281 could not be found. [ 1281.396396] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-f6de3ad1-5655-4014-bb0c-f1bdc9de330e tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1281.396493] env[62277]: INFO nova.compute.manager [None req-f6de3ad1-5655-4014-bb0c-f1bdc9de330e tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1281.396746] env[62277]: DEBUG oslo.service.loopingcall [None req-f6de3ad1-5655-4014-bb0c-f1bdc9de330e tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1281.396980] env[62277]: DEBUG nova.compute.manager [-] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1281.397088] env[62277]: DEBUG nova.network.neutron [-] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1281.403573] env[62277]: DEBUG nova.compute.manager [None req-8efded13-d761-4957-905d-83cb75b63cfd tempest-ServerDiagnosticsTest-825586886 tempest-ServerDiagnosticsTest-825586886-project-member] [instance: f646a534-1ae8-40dd-9819-3d71bda87ae2] Instance disappeared before build. 
{{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1281.427488] env[62277]: DEBUG oslo_concurrency.lockutils [None req-8efded13-d761-4957-905d-83cb75b63cfd tempest-ServerDiagnosticsTest-825586886 tempest-ServerDiagnosticsTest-825586886-project-member] Lock "f646a534-1ae8-40dd-9819-3d71bda87ae2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 239.226s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1281.444253] env[62277]: DEBUG nova.compute.manager [None req-be1c341e-2f32-49d0-9112-018733d49685 tempest-InstanceActionsV221TestJSON-794172250 tempest-InstanceActionsV221TestJSON-794172250-project-member] [instance: 2b98866e-3c86-47bd-9eff-2c2743631563] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1281.481157] env[62277]: DEBUG nova.compute.manager [None req-be1c341e-2f32-49d0-9112-018733d49685 tempest-InstanceActionsV221TestJSON-794172250 tempest-InstanceActionsV221TestJSON-794172250-project-member] [instance: 2b98866e-3c86-47bd-9eff-2c2743631563] Instance disappeared before build. {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1281.507398] env[62277]: DEBUG oslo_concurrency.lockutils [None req-be1c341e-2f32-49d0-9112-018733d49685 tempest-InstanceActionsV221TestJSON-794172250 tempest-InstanceActionsV221TestJSON-794172250-project-member] Lock "2b98866e-3c86-47bd-9eff-2c2743631563" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 239.107s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1281.519297] env[62277]: DEBUG nova.compute.manager [None req-5c153d52-0ee9-437c-910f-1755e96827aa tempest-ServerShowV254Test-1304854974 tempest-ServerShowV254Test-1304854974-project-member] [instance: 560b7750-03fe-4a4c-ab1d-a1751895986b] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1281.523244] env[62277]: DEBUG nova.network.neutron [-] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1281.538524] env[62277]: INFO nova.compute.manager [-] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] Took 0.14 seconds to deallocate network for instance. [ 1281.544301] env[62277]: DEBUG nova.compute.manager [None req-5c153d52-0ee9-437c-910f-1755e96827aa tempest-ServerShowV254Test-1304854974 tempest-ServerShowV254Test-1304854974-project-member] [instance: 560b7750-03fe-4a4c-ab1d-a1751895986b] Instance disappeared before build. 
{{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1281.562800] env[62277]: DEBUG oslo_concurrency.lockutils [None req-c3e8c766-784b-47ca-8361-aea05dd9ff21 tempest-ServerExternalEventsTest-112581389 tempest-ServerExternalEventsTest-112581389-project-member] Acquiring lock "4ca037fc-9a4e-413b-9b4e-2122f5a4fe18" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1281.563049] env[62277]: DEBUG oslo_concurrency.lockutils [None req-c3e8c766-784b-47ca-8361-aea05dd9ff21 tempest-ServerExternalEventsTest-112581389 tempest-ServerExternalEventsTest-112581389-project-member] Lock "4ca037fc-9a4e-413b-9b4e-2122f5a4fe18" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1281.577580] env[62277]: DEBUG oslo_concurrency.lockutils [None req-5c153d52-0ee9-437c-910f-1755e96827aa tempest-ServerShowV254Test-1304854974 tempest-ServerShowV254Test-1304854974-project-member] Lock "560b7750-03fe-4a4c-ab1d-a1751895986b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 238.425s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1281.590761] env[62277]: DEBUG nova.compute.manager [None req-286b155a-78f0-4f14-9ee9-d6607e7a609d tempest-ImagesNegativeTestJSON-1671068824 tempest-ImagesNegativeTestJSON-1671068824-project-member] [instance: 86e8e8ba-e476-400d-b180-bb7df8a042d8] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1281.617857] env[62277]: DEBUG nova.compute.manager [None req-286b155a-78f0-4f14-9ee9-d6607e7a609d tempest-ImagesNegativeTestJSON-1671068824 tempest-ImagesNegativeTestJSON-1671068824-project-member] [instance: 86e8e8ba-e476-400d-b180-bb7df8a042d8] Instance disappeared before build. 
{{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1281.646859] env[62277]: DEBUG oslo_concurrency.lockutils [None req-286b155a-78f0-4f14-9ee9-d6607e7a609d tempest-ImagesNegativeTestJSON-1671068824 tempest-ImagesNegativeTestJSON-1671068824-project-member] Lock "86e8e8ba-e476-400d-b180-bb7df8a042d8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 237.229s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1281.648291] env[62277]: DEBUG oslo_concurrency.lockutils [None req-f6de3ad1-5655-4014-bb0c-f1bdc9de330e tempest-ServersTestJSON-2037769424 tempest-ServersTestJSON-2037769424-project-member] Lock "dfc291fd-1481-4e76-9fb3-ec87124c1281" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.298s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1281.649075] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "dfc291fd-1481-4e76-9fb3-ec87124c1281" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 33.512s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1281.649247] env[62277]: INFO nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: dfc291fd-1481-4e76-9fb3-ec87124c1281] During sync_power_state the instance has a pending task (deleting). Skip. [ 1281.649884] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "dfc291fd-1481-4e76-9fb3-ec87124c1281" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1281.658117] env[62277]: DEBUG nova.compute.manager [None req-ad261aee-be38-472d-a103-72943d529497 tempest-ServersNegativeTestMultiTenantJSON-169553101 tempest-ServersNegativeTestMultiTenantJSON-169553101-project-member] [instance: c3e72352-f795-4ce7-9e0b-4e80c4329f7b] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1281.683335] env[62277]: DEBUG nova.compute.manager [None req-ad261aee-be38-472d-a103-72943d529497 tempest-ServersNegativeTestMultiTenantJSON-169553101 tempest-ServersNegativeTestMultiTenantJSON-169553101-project-member] [instance: c3e72352-f795-4ce7-9e0b-4e80c4329f7b] Instance disappeared before build. 
{{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1281.704511] env[62277]: DEBUG oslo_concurrency.lockutils [None req-ad261aee-be38-472d-a103-72943d529497 tempest-ServersNegativeTestMultiTenantJSON-169553101 tempest-ServersNegativeTestMultiTenantJSON-169553101-project-member] Lock "c3e72352-f795-4ce7-9e0b-4e80c4329f7b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 229.661s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1281.714182] env[62277]: DEBUG nova.compute.manager [None req-832722ed-7844-4d20-8fab-7929c1c0c2ff tempest-ServerActionsTestOtherB-254108444 tempest-ServerActionsTestOtherB-254108444-project-member] [instance: b6908c32-5916-4a0e-92e2-21f480c5f7ca] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1281.739681] env[62277]: DEBUG nova.compute.manager [None req-832722ed-7844-4d20-8fab-7929c1c0c2ff tempest-ServerActionsTestOtherB-254108444 tempest-ServerActionsTestOtherB-254108444-project-member] [instance: b6908c32-5916-4a0e-92e2-21f480c5f7ca] Instance disappeared before build. {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1281.761700] env[62277]: DEBUG oslo_concurrency.lockutils [None req-832722ed-7844-4d20-8fab-7929c1c0c2ff tempest-ServerActionsTestOtherB-254108444 tempest-ServerActionsTestOtherB-254108444-project-member] Lock "b6908c32-5916-4a0e-92e2-21f480c5f7ca" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 222.201s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1281.770885] env[62277]: DEBUG nova.compute.manager [None req-267d2afe-de5d-4805-b2e9-11d8f2b57316 tempest-InstanceActionsNegativeTestJSON-286532833 tempest-InstanceActionsNegativeTestJSON-286532833-project-member] [instance: 27cd13ca-a17c-476e-a00a-cca1fe898763] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1281.802378] env[62277]: DEBUG nova.compute.manager [None req-267d2afe-de5d-4805-b2e9-11d8f2b57316 tempest-InstanceActionsNegativeTestJSON-286532833 tempest-InstanceActionsNegativeTestJSON-286532833-project-member] [instance: 27cd13ca-a17c-476e-a00a-cca1fe898763] Instance disappeared before build. {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1281.826018] env[62277]: DEBUG oslo_concurrency.lockutils [None req-267d2afe-de5d-4805-b2e9-11d8f2b57316 tempest-InstanceActionsNegativeTestJSON-286532833 tempest-InstanceActionsNegativeTestJSON-286532833-project-member] Lock "27cd13ca-a17c-476e-a00a-cca1fe898763" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 213.426s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1281.836985] env[62277]: DEBUG nova.compute.manager [None req-8ca2c13e-fb65-4eb0-8646-8fef9a5fc83f tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] [instance: c000e183-2e57-470e-a9a5-30b5899e77c1] Starting instance... 
{{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1281.862870] env[62277]: DEBUG nova.compute.manager [None req-8ca2c13e-fb65-4eb0-8646-8fef9a5fc83f tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] [instance: c000e183-2e57-470e-a9a5-30b5899e77c1] Instance disappeared before build. {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1281.885789] env[62277]: DEBUG oslo_concurrency.lockutils [None req-8ca2c13e-fb65-4eb0-8646-8fef9a5fc83f tempest-MigrationsAdminTest-1415859420 tempest-MigrationsAdminTest-1415859420-project-member] Lock "c000e183-2e57-470e-a9a5-30b5899e77c1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 198.680s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1281.896121] env[62277]: DEBUG nova.compute.manager [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1281.953538] env[62277]: DEBUG oslo_concurrency.lockutils [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1281.953837] env[62277]: DEBUG oslo_concurrency.lockutils [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1281.956491] env[62277]: INFO nova.compute.claims [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1282.168139] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1282.329594] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-49de63bc-4902-40bb-be88-19dde885b7d9 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1282.340290] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9fee9263-9b1a-46d5-a964-be65168ffc3e {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1282.374057] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3d6d9900-b89e-49e6-a210-ca3545dd2de4 {{(pid=62277) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1282.385495] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dc32f92e-c2f6-4bd1-9867-cc2bebd054f7 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1282.396509] env[62277]: DEBUG nova.compute.provider_tree [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1282.409122] env[62277]: DEBUG nova.scheduler.client.report [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1282.434140] env[62277]: DEBUG oslo_concurrency.lockutils [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.478s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1282.434140] env[62277]: DEBUG nova.compute.manager [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Start building networks asynchronously for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1282.474076] env[62277]: DEBUG nova.compute.utils [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Using /dev/sd instead of None {{(pid=62277) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1282.474076] env[62277]: DEBUG nova.compute.manager [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Allocating IP information in the background. 
{{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1282.474440] env[62277]: DEBUG nova.network.neutron [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] allocate_for_instance() {{(pid=62277) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1282.486093] env[62277]: DEBUG nova.compute.manager [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Start building block device mappings for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1282.554299] env[62277]: DEBUG nova.compute.manager [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Start spawning the instance on the hypervisor. {{(pid=62277) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1282.644855] env[62277]: DEBUG nova.virt.hardware [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-11-06T22:11:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-11-06T22:11:15Z,direct_url=,disk_format='vmdk',id=6f125163-af69-40e9-92ae-3b8a01d74b60,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='63ed6630e9c140baa826f53d7a0564d1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-11-06T22:11:16Z,virtual_size=,visibility=), allow threads: False {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1282.645151] env[62277]: DEBUG nova.virt.hardware [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Flavor limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1282.645313] env[62277]: DEBUG nova.virt.hardware [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Image limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1282.645447] env[62277]: DEBUG nova.virt.hardware [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Flavor pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1282.645587] env[62277]: DEBUG nova.virt.hardware [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Image pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} 
[ 1282.645732] env[62277]: DEBUG nova.virt.hardware [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1282.646480] env[62277]: DEBUG nova.virt.hardware [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1282.646480] env[62277]: DEBUG nova.virt.hardware [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1282.646480] env[62277]: DEBUG nova.virt.hardware [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Got 1 possible topologies {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1282.646480] env[62277]: DEBUG nova.virt.hardware [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1282.646726] env[62277]: DEBUG nova.virt.hardware [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1282.647556] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b5cad335-0224-45ea-ba5e-ce3886979226 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1282.655996] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0aadf2a9-9650-44ed-a861-c629a94d0190 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1282.681221] env[62277]: DEBUG nova.policy [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1f232a197cbc4094aa3b16f3ac856149', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0637edf123a14c9481b07ca6826d6456', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62277) authorize /opt/stack/nova/nova/policy.py:203}} [ 1283.139835] env[62277]: DEBUG nova.network.neutron [None 
req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Successfully created port: d120af2e-395f-461e-9161-49f964e75549 {{(pid=62277) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1283.913998] env[62277]: DEBUG nova.compute.manager [req-94c5c277-35b2-49ad-afdc-5224701702a7 req-40d7bb94-a126-4f4f-b3d5-8fd845691382 service nova] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Received event network-vif-plugged-d120af2e-395f-461e-9161-49f964e75549 {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 1283.914300] env[62277]: DEBUG oslo_concurrency.lockutils [req-94c5c277-35b2-49ad-afdc-5224701702a7 req-40d7bb94-a126-4f4f-b3d5-8fd845691382 service nova] Acquiring lock "21bd4623-2b46-43c4-859f-c4d3bf261e1f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1283.914463] env[62277]: DEBUG oslo_concurrency.lockutils [req-94c5c277-35b2-49ad-afdc-5224701702a7 req-40d7bb94-a126-4f4f-b3d5-8fd845691382 service nova] Lock "21bd4623-2b46-43c4-859f-c4d3bf261e1f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1283.914624] env[62277]: DEBUG oslo_concurrency.lockutils [req-94c5c277-35b2-49ad-afdc-5224701702a7 req-40d7bb94-a126-4f4f-b3d5-8fd845691382 service nova] Lock "21bd4623-2b46-43c4-859f-c4d3bf261e1f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1283.914785] env[62277]: DEBUG nova.compute.manager [req-94c5c277-35b2-49ad-afdc-5224701702a7 req-40d7bb94-a126-4f4f-b3d5-8fd845691382 service nova] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] No waiting events found dispatching network-vif-plugged-d120af2e-395f-461e-9161-49f964e75549 {{(pid=62277) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1283.915170] env[62277]: WARNING nova.compute.manager [req-94c5c277-35b2-49ad-afdc-5224701702a7 req-40d7bb94-a126-4f4f-b3d5-8fd845691382 service nova] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Received unexpected event network-vif-plugged-d120af2e-395f-461e-9161-49f964e75549 for instance with vm_state building and task_state spawning. 
[ 1283.979582] env[62277]: DEBUG nova.network.neutron [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Successfully updated port: d120af2e-395f-461e-9161-49f964e75549 {{(pid=62277) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1283.992051] env[62277]: DEBUG oslo_concurrency.lockutils [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Acquiring lock "refresh_cache-21bd4623-2b46-43c4-859f-c4d3bf261e1f" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1283.992251] env[62277]: DEBUG oslo_concurrency.lockutils [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Acquired lock "refresh_cache-21bd4623-2b46-43c4-859f-c4d3bf261e1f" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1283.992357] env[62277]: DEBUG nova.network.neutron [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Building network info cache for instance {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1284.060247] env[62277]: DEBUG nova.network.neutron [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Instance cache missing network info. 
{{(pid=62277) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1284.304096] env[62277]: DEBUG nova.network.neutron [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Updating instance_info_cache with network_info: [{"id": "d120af2e-395f-461e-9161-49f964e75549", "address": "fa:16:3e:40:58:90", "network": {"id": "2c64e00f-4995-4827-b685-5096b1d1d064", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.178", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "63ed6630e9c140baa826f53d7a0564d1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "77aa121f-8fb6-42f3-aaea-43addfe449b2", "external-id": "nsx-vlan-transportzone-288", "segmentation_id": 288, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd120af2e-39", "ovs_interfaceid": "d120af2e-395f-461e-9161-49f964e75549", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1284.323551] env[62277]: DEBUG oslo_concurrency.lockutils [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Releasing lock "refresh_cache-21bd4623-2b46-43c4-859f-c4d3bf261e1f" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1284.323911] env[62277]: DEBUG nova.compute.manager [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Instance network_info: |[{"id": "d120af2e-395f-461e-9161-49f964e75549", "address": "fa:16:3e:40:58:90", "network": {"id": "2c64e00f-4995-4827-b685-5096b1d1d064", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.178", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "63ed6630e9c140baa826f53d7a0564d1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "77aa121f-8fb6-42f3-aaea-43addfe449b2", "external-id": "nsx-vlan-transportzone-288", "segmentation_id": 288, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd120af2e-39", "ovs_interfaceid": "d120af2e-395f-461e-9161-49f964e75549", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1284.324389] env[62277]: DEBUG 
nova.virt.vmwareapi.vmops [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:40:58:90', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '77aa121f-8fb6-42f3-aaea-43addfe449b2', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'd120af2e-395f-461e-9161-49f964e75549', 'vif_model': 'vmxnet3'}] {{(pid=62277) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1284.332585] env[62277]: DEBUG oslo.service.loopingcall [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1284.333172] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Creating VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1284.333430] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-966164df-9719-4cae-b8ad-bd740744f898 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1284.355145] env[62277]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1284.355145] env[62277]: value = "task-1405381" [ 1284.355145] env[62277]: _type = "Task" [ 1284.355145] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1284.363835] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405381, 'name': CreateVM_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1284.868294] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405381, 'name': CreateVM_Task, 'duration_secs': 0.304822} completed successfully. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1284.868294] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Created VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1284.868497] env[62277]: DEBUG oslo_concurrency.lockutils [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1284.869096] env[62277]: DEBUG oslo_concurrency.lockutils [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1284.869459] env[62277]: DEBUG oslo_concurrency.lockutils [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1284.869719] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-aa9b8bdb-f4c8-4a19-9c54-e797b1a2f664 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1284.874354] env[62277]: DEBUG oslo_vmware.api [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Waiting for the task: (returnval){ [ 1284.874354] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52011be4-4437-14a7-62c6-abb282fcf228" [ 1284.874354] env[62277]: _type = "Task" [ 1284.874354] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1284.882216] env[62277]: DEBUG oslo_vmware.api [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52011be4-4437-14a7-62c6-abb282fcf228, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1285.168606] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1285.168870] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=62277) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10528}} [ 1285.385645] env[62277]: DEBUG oslo_concurrency.lockutils [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1285.386216] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Processing image 6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1285.386671] env[62277]: DEBUG oslo_concurrency.lockutils [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1286.033544] env[62277]: DEBUG nova.compute.manager [req-ec3e4b29-ec5b-4391-82b7-ce3a34ac8e28 req-b48ed4a9-64eb-4327-974a-2af17e88f185 service nova] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Received event network-changed-d120af2e-395f-461e-9161-49f964e75549 {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 1286.033808] env[62277]: DEBUG nova.compute.manager [req-ec3e4b29-ec5b-4391-82b7-ce3a34ac8e28 req-b48ed4a9-64eb-4327-974a-2af17e88f185 service nova] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Refreshing instance network info cache due to event network-changed-d120af2e-395f-461e-9161-49f964e75549. {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11104}} [ 1286.033988] env[62277]: DEBUG oslo_concurrency.lockutils [req-ec3e4b29-ec5b-4391-82b7-ce3a34ac8e28 req-b48ed4a9-64eb-4327-974a-2af17e88f185 service nova] Acquiring lock "refresh_cache-21bd4623-2b46-43c4-859f-c4d3bf261e1f" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1286.034128] env[62277]: DEBUG oslo_concurrency.lockutils [req-ec3e4b29-ec5b-4391-82b7-ce3a34ac8e28 req-b48ed4a9-64eb-4327-974a-2af17e88f185 service nova] Acquired lock "refresh_cache-21bd4623-2b46-43c4-859f-c4d3bf261e1f" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1286.034304] env[62277]: DEBUG nova.network.neutron [req-ec3e4b29-ec5b-4391-82b7-ce3a34ac8e28 req-b48ed4a9-64eb-4327-974a-2af17e88f185 service nova] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Refreshing network info cache for port d120af2e-395f-461e-9161-49f964e75549 {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1286.835716] env[62277]: DEBUG nova.network.neutron [req-ec3e4b29-ec5b-4391-82b7-ce3a34ac8e28 req-b48ed4a9-64eb-4327-974a-2af17e88f185 service nova] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Updated VIF entry in instance network info cache for port d120af2e-395f-461e-9161-49f964e75549. 
{{(pid=62277) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1286.836076] env[62277]: DEBUG nova.network.neutron [req-ec3e4b29-ec5b-4391-82b7-ce3a34ac8e28 req-b48ed4a9-64eb-4327-974a-2af17e88f185 service nova] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Updating instance_info_cache with network_info: [{"id": "d120af2e-395f-461e-9161-49f964e75549", "address": "fa:16:3e:40:58:90", "network": {"id": "2c64e00f-4995-4827-b685-5096b1d1d064", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.178", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "63ed6630e9c140baa826f53d7a0564d1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "77aa121f-8fb6-42f3-aaea-43addfe449b2", "external-id": "nsx-vlan-transportzone-288", "segmentation_id": 288, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd120af2e-39", "ovs_interfaceid": "d120af2e-395f-461e-9161-49f964e75549", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1286.848038] env[62277]: DEBUG oslo_concurrency.lockutils [req-ec3e4b29-ec5b-4391-82b7-ce3a34ac8e28 req-b48ed4a9-64eb-4327-974a-2af17e88f185 service nova] Releasing lock "refresh_cache-21bd4623-2b46-43c4-859f-c4d3bf261e1f" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1289.394365] env[62277]: DEBUG oslo_concurrency.lockutils [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Acquiring lock "900160c8-a715-45a4-8709-b314fc3216d5" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1289.394365] env[62277]: DEBUG oslo_concurrency.lockutils [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Lock "900160c8-a715-45a4-8709-b314fc3216d5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1311.082447] env[62277]: DEBUG oslo_concurrency.lockutils [None req-2b0870bb-0544-48f2-bcd2-316ff7a5acd3 tempest-AttachInterfacesV270Test-44240375 tempest-AttachInterfacesV270Test-44240375-project-member] Acquiring lock "0901d48b-bc88-461b-8503-eb6b51c39148" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1311.082819] env[62277]: DEBUG oslo_concurrency.lockutils [None req-2b0870bb-0544-48f2-bcd2-316ff7a5acd3 tempest-AttachInterfacesV270Test-44240375 tempest-AttachInterfacesV270Test-44240375-project-member] Lock 
"0901d48b-bc88-461b-8503-eb6b51c39148" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1327.598611] env[62277]: WARNING oslo_vmware.rw_handles [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1327.598611] env[62277]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1327.598611] env[62277]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1327.598611] env[62277]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1327.598611] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1327.598611] env[62277]: ERROR oslo_vmware.rw_handles response.begin() [ 1327.598611] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1327.598611] env[62277]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1327.598611] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1327.598611] env[62277]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1327.598611] env[62277]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1327.598611] env[62277]: ERROR oslo_vmware.rw_handles [ 1327.599413] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] [instance: 68925f1b-da69-4955-acb1-d6500b03daee] Downloaded image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to vmware_temp/85b4e6f8-64d9-4e6a-b90c-828384a37a3c/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1327.600881] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] [instance: 68925f1b-da69-4955-acb1-d6500b03daee] Caching image {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1327.601149] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Copying Virtual Disk [datastore2] vmware_temp/85b4e6f8-64d9-4e6a-b90c-828384a37a3c/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk to [datastore2] vmware_temp/85b4e6f8-64d9-4e6a-b90c-828384a37a3c/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk {{(pid=62277) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1327.601444] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-3bb37fd4-8bf7-45c6-8aee-d8d7e5d341cb {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 
1327.610689] env[62277]: DEBUG oslo_vmware.api [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Waiting for the task: (returnval){ [ 1327.610689] env[62277]: value = "task-1405382" [ 1327.610689] env[62277]: _type = "Task" [ 1327.610689] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1327.619653] env[62277]: DEBUG oslo_vmware.api [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Task: {'id': task-1405382, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1328.121330] env[62277]: DEBUG oslo_vmware.exceptions [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Fault InvalidArgument not matched. {{(pid=62277) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1328.121613] env[62277]: DEBUG oslo_concurrency.lockutils [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1328.122164] env[62277]: ERROR nova.compute.manager [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] [instance: 68925f1b-da69-4955-acb1-d6500b03daee] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1328.122164] env[62277]: Faults: ['InvalidArgument'] [ 1328.122164] env[62277]: ERROR nova.compute.manager [instance: 68925f1b-da69-4955-acb1-d6500b03daee] Traceback (most recent call last): [ 1328.122164] env[62277]: ERROR nova.compute.manager [instance: 68925f1b-da69-4955-acb1-d6500b03daee] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1328.122164] env[62277]: ERROR nova.compute.manager [instance: 68925f1b-da69-4955-acb1-d6500b03daee] yield resources [ 1328.122164] env[62277]: ERROR nova.compute.manager [instance: 68925f1b-da69-4955-acb1-d6500b03daee] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1328.122164] env[62277]: ERROR nova.compute.manager [instance: 68925f1b-da69-4955-acb1-d6500b03daee] self.driver.spawn(context, instance, image_meta, [ 1328.122164] env[62277]: ERROR nova.compute.manager [instance: 68925f1b-da69-4955-acb1-d6500b03daee] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1328.122164] env[62277]: ERROR nova.compute.manager [instance: 68925f1b-da69-4955-acb1-d6500b03daee] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1328.122164] env[62277]: ERROR nova.compute.manager [instance: 68925f1b-da69-4955-acb1-d6500b03daee] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1328.122164] env[62277]: ERROR nova.compute.manager [instance: 68925f1b-da69-4955-acb1-d6500b03daee] self._fetch_image_if_missing(context, vi) [ 1328.122164] env[62277]: 
ERROR nova.compute.manager [instance: 68925f1b-da69-4955-acb1-d6500b03daee] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1328.122164] env[62277]: ERROR nova.compute.manager [instance: 68925f1b-da69-4955-acb1-d6500b03daee] image_cache(vi, tmp_image_ds_loc) [ 1328.122164] env[62277]: ERROR nova.compute.manager [instance: 68925f1b-da69-4955-acb1-d6500b03daee] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1328.122164] env[62277]: ERROR nova.compute.manager [instance: 68925f1b-da69-4955-acb1-d6500b03daee] vm_util.copy_virtual_disk( [ 1328.122164] env[62277]: ERROR nova.compute.manager [instance: 68925f1b-da69-4955-acb1-d6500b03daee] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1328.122164] env[62277]: ERROR nova.compute.manager [instance: 68925f1b-da69-4955-acb1-d6500b03daee] session._wait_for_task(vmdk_copy_task) [ 1328.122164] env[62277]: ERROR nova.compute.manager [instance: 68925f1b-da69-4955-acb1-d6500b03daee] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1328.122164] env[62277]: ERROR nova.compute.manager [instance: 68925f1b-da69-4955-acb1-d6500b03daee] return self.wait_for_task(task_ref) [ 1328.122164] env[62277]: ERROR nova.compute.manager [instance: 68925f1b-da69-4955-acb1-d6500b03daee] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1328.122164] env[62277]: ERROR nova.compute.manager [instance: 68925f1b-da69-4955-acb1-d6500b03daee] return evt.wait() [ 1328.122164] env[62277]: ERROR nova.compute.manager [instance: 68925f1b-da69-4955-acb1-d6500b03daee] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1328.122164] env[62277]: ERROR nova.compute.manager [instance: 68925f1b-da69-4955-acb1-d6500b03daee] result = hub.switch() [ 1328.122164] env[62277]: ERROR nova.compute.manager [instance: 68925f1b-da69-4955-acb1-d6500b03daee] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1328.122164] env[62277]: ERROR nova.compute.manager [instance: 68925f1b-da69-4955-acb1-d6500b03daee] return self.greenlet.switch() [ 1328.122164] env[62277]: ERROR nova.compute.manager [instance: 68925f1b-da69-4955-acb1-d6500b03daee] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1328.122164] env[62277]: ERROR nova.compute.manager [instance: 68925f1b-da69-4955-acb1-d6500b03daee] self.f(*self.args, **self.kw) [ 1328.122164] env[62277]: ERROR nova.compute.manager [instance: 68925f1b-da69-4955-acb1-d6500b03daee] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1328.122164] env[62277]: ERROR nova.compute.manager [instance: 68925f1b-da69-4955-acb1-d6500b03daee] raise exceptions.translate_fault(task_info.error) [ 1328.122164] env[62277]: ERROR nova.compute.manager [instance: 68925f1b-da69-4955-acb1-d6500b03daee] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1328.122164] env[62277]: ERROR nova.compute.manager [instance: 68925f1b-da69-4955-acb1-d6500b03daee] Faults: ['InvalidArgument'] [ 1328.122164] env[62277]: ERROR nova.compute.manager [instance: 68925f1b-da69-4955-acb1-d6500b03daee] [ 1328.123512] env[62277]: INFO nova.compute.manager [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 
tempest-ServerDiagnosticsV248Test-1615622326-project-member] [instance: 68925f1b-da69-4955-acb1-d6500b03daee] Terminating instance [ 1328.124204] env[62277]: DEBUG oslo_concurrency.lockutils [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1328.124437] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1328.124665] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-fa956574-f2e7-45ba-965f-6a024896a122 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1328.126875] env[62277]: DEBUG oslo_concurrency.lockutils [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Acquiring lock "refresh_cache-68925f1b-da69-4955-acb1-d6500b03daee" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1328.127047] env[62277]: DEBUG oslo_concurrency.lockutils [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Acquired lock "refresh_cache-68925f1b-da69-4955-acb1-d6500b03daee" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1328.127216] env[62277]: DEBUG nova.network.neutron [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] [instance: 68925f1b-da69-4955-acb1-d6500b03daee] Building network info cache for instance {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1328.133996] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1328.134182] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=62277) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1328.134905] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-200833f6-bc84-42fd-bd16-f5e819603010 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1328.142379] env[62277]: DEBUG oslo_vmware.api [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Waiting for the task: (returnval){ [ 1328.142379] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]529a8cad-f40b-a60d-dd45-daae4dee056a" [ 1328.142379] env[62277]: _type = "Task" [ 1328.142379] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1328.150827] env[62277]: DEBUG oslo_vmware.api [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]529a8cad-f40b-a60d-dd45-daae4dee056a, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1328.157190] env[62277]: DEBUG nova.network.neutron [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] [instance: 68925f1b-da69-4955-acb1-d6500b03daee] Instance cache missing network info. {{(pid=62277) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1328.218579] env[62277]: DEBUG nova.network.neutron [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] [instance: 68925f1b-da69-4955-acb1-d6500b03daee] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1328.230184] env[62277]: DEBUG oslo_concurrency.lockutils [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Releasing lock "refresh_cache-68925f1b-da69-4955-acb1-d6500b03daee" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1328.230614] env[62277]: DEBUG nova.compute.manager [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] [instance: 68925f1b-da69-4955-acb1-d6500b03daee] Start destroying the instance on the hypervisor. 
{{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1328.230805] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] [instance: 68925f1b-da69-4955-acb1-d6500b03daee] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1328.231890] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c1be1547-2acf-4900-bec0-4efbbbc1cf4a {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1328.239659] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] [instance: 68925f1b-da69-4955-acb1-d6500b03daee] Unregistering the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1328.239885] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-447dc355-4a9b-4084-bf9d-236e6a43579b {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1328.274519] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] [instance: 68925f1b-da69-4955-acb1-d6500b03daee] Unregistered the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1328.274765] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] [instance: 68925f1b-da69-4955-acb1-d6500b03daee] Deleting contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1328.274940] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Deleting the datastore file [datastore2] 68925f1b-da69-4955-acb1-d6500b03daee {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1328.275219] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-6d0780c3-b1d7-4a3c-8183-369c18a0df49 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1328.282068] env[62277]: DEBUG oslo_vmware.api [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Waiting for the task: (returnval){ [ 1328.282068] env[62277]: value = "task-1405384" [ 1328.282068] env[62277]: _type = "Task" [ 1328.282068] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1328.289762] env[62277]: DEBUG oslo_vmware.api [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Task: {'id': task-1405384, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1328.656009] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] Preparing fetch location {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1328.656284] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Creating directory with path [datastore2] vmware_temp/f17dde81-ffb5-46b0-b23f-eeac5e5c5094/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1328.656974] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b1dd3525-6147-4bb9-a119-b8506fbd03a5 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1328.658967] env[62277]: DEBUG oslo_concurrency.lockutils [None req-cef0743f-f806-4715-8c0f-19c18a124efd tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Acquiring lock "21bd4623-2b46-43c4-859f-c4d3bf261e1f" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1328.668340] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Created directory with path [datastore2] vmware_temp/f17dde81-ffb5-46b0-b23f-eeac5e5c5094/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1328.668529] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] Fetch image to [datastore2] vmware_temp/f17dde81-ffb5-46b0-b23f-eeac5e5c5094/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1328.668705] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to [datastore2] vmware_temp/f17dde81-ffb5-46b0-b23f-eeac5e5c5094/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1328.669877] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-725a200c-437c-491e-8917-5f678b942c3c {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1328.677032] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0cc1cc27-7531-47f2-a692-18cf371b13d7 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} 
[ 1328.686861] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-146f7bfc-448a-445a-940f-d3a3491ea7d2 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1328.717259] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71fe53b2-b192-439f-913d-36c3b980275d {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1328.722461] env[62277]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-1489392c-802d-4419-b95a-d61cfe773e3f {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1328.741832] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1328.791751] env[62277]: DEBUG oslo_vmware.api [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Task: {'id': task-1405384, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.031967} completed successfully. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1328.792616] env[62277]: DEBUG oslo_vmware.rw_handles [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f17dde81-ffb5-46b0-b23f-eeac5e5c5094/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=62277) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1328.794036] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Deleted the datastore file {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1328.794263] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] [instance: 68925f1b-da69-4955-acb1-d6500b03daee] Deleted contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1328.794438] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] [instance: 68925f1b-da69-4955-acb1-d6500b03daee] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1328.795042] env[62277]: INFO nova.compute.manager [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] [instance: 68925f1b-da69-4955-acb1-d6500b03daee] Took 0.56 seconds to destroy the instance on the hypervisor. [ 1328.795042] env[62277]: DEBUG oslo.service.loopingcall [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1328.848343] env[62277]: DEBUG nova.compute.manager [-] [instance: 68925f1b-da69-4955-acb1-d6500b03daee] Skipping network deallocation for instance since networking was not requested. {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1328.853052] env[62277]: DEBUG oslo_vmware.rw_handles [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Completed reading data from the image iterator. {{(pid=62277) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1328.853161] env[62277]: DEBUG oslo_vmware.rw_handles [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f17dde81-ffb5-46b0-b23f-eeac5e5c5094/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=62277) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1328.853836] env[62277]: DEBUG nova.compute.claims [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] [instance: 68925f1b-da69-4955-acb1-d6500b03daee] Aborting claim: {{(pid=62277) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1328.854108] env[62277]: DEBUG oslo_concurrency.lockutils [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1328.854444] env[62277]: DEBUG oslo_concurrency.lockutils [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1329.181176] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bb1fbe47-e805-427b-a0d8-a47512b1dd60 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1329.188496] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7248c83e-b29a-42b1-97f6-1a1a94f0f695 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1329.218373] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4c901cf-d0c6-4c4f-8480-8d20535db82f {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1329.224881] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c253577c-1194-4ffa-9ee5-cf36a7c6db0f {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1329.238505] env[62277]: DEBUG nova.compute.provider_tree [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1329.246962] env[62277]: DEBUG nova.scheduler.client.report [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 
1329.265752] env[62277]: DEBUG oslo_concurrency.lockutils [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.411s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1329.266161] env[62277]: ERROR nova.compute.manager [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] [instance: 68925f1b-da69-4955-acb1-d6500b03daee] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1329.266161] env[62277]: Faults: ['InvalidArgument'] [ 1329.266161] env[62277]: ERROR nova.compute.manager [instance: 68925f1b-da69-4955-acb1-d6500b03daee] Traceback (most recent call last): [ 1329.266161] env[62277]: ERROR nova.compute.manager [instance: 68925f1b-da69-4955-acb1-d6500b03daee] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1329.266161] env[62277]: ERROR nova.compute.manager [instance: 68925f1b-da69-4955-acb1-d6500b03daee] self.driver.spawn(context, instance, image_meta, [ 1329.266161] env[62277]: ERROR nova.compute.manager [instance: 68925f1b-da69-4955-acb1-d6500b03daee] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1329.266161] env[62277]: ERROR nova.compute.manager [instance: 68925f1b-da69-4955-acb1-d6500b03daee] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1329.266161] env[62277]: ERROR nova.compute.manager [instance: 68925f1b-da69-4955-acb1-d6500b03daee] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1329.266161] env[62277]: ERROR nova.compute.manager [instance: 68925f1b-da69-4955-acb1-d6500b03daee] self._fetch_image_if_missing(context, vi) [ 1329.266161] env[62277]: ERROR nova.compute.manager [instance: 68925f1b-da69-4955-acb1-d6500b03daee] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1329.266161] env[62277]: ERROR nova.compute.manager [instance: 68925f1b-da69-4955-acb1-d6500b03daee] image_cache(vi, tmp_image_ds_loc) [ 1329.266161] env[62277]: ERROR nova.compute.manager [instance: 68925f1b-da69-4955-acb1-d6500b03daee] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1329.266161] env[62277]: ERROR nova.compute.manager [instance: 68925f1b-da69-4955-acb1-d6500b03daee] vm_util.copy_virtual_disk( [ 1329.266161] env[62277]: ERROR nova.compute.manager [instance: 68925f1b-da69-4955-acb1-d6500b03daee] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1329.266161] env[62277]: ERROR nova.compute.manager [instance: 68925f1b-da69-4955-acb1-d6500b03daee] session._wait_for_task(vmdk_copy_task) [ 1329.266161] env[62277]: ERROR nova.compute.manager [instance: 68925f1b-da69-4955-acb1-d6500b03daee] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1329.266161] env[62277]: ERROR nova.compute.manager [instance: 68925f1b-da69-4955-acb1-d6500b03daee] return self.wait_for_task(task_ref) [ 1329.266161] env[62277]: ERROR nova.compute.manager [instance: 68925f1b-da69-4955-acb1-d6500b03daee] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1329.266161] 
env[62277]: ERROR nova.compute.manager [instance: 68925f1b-da69-4955-acb1-d6500b03daee] return evt.wait() [ 1329.266161] env[62277]: ERROR nova.compute.manager [instance: 68925f1b-da69-4955-acb1-d6500b03daee] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1329.266161] env[62277]: ERROR nova.compute.manager [instance: 68925f1b-da69-4955-acb1-d6500b03daee] result = hub.switch() [ 1329.266161] env[62277]: ERROR nova.compute.manager [instance: 68925f1b-da69-4955-acb1-d6500b03daee] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1329.266161] env[62277]: ERROR nova.compute.manager [instance: 68925f1b-da69-4955-acb1-d6500b03daee] return self.greenlet.switch() [ 1329.266161] env[62277]: ERROR nova.compute.manager [instance: 68925f1b-da69-4955-acb1-d6500b03daee] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1329.266161] env[62277]: ERROR nova.compute.manager [instance: 68925f1b-da69-4955-acb1-d6500b03daee] self.f(*self.args, **self.kw) [ 1329.266161] env[62277]: ERROR nova.compute.manager [instance: 68925f1b-da69-4955-acb1-d6500b03daee] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1329.266161] env[62277]: ERROR nova.compute.manager [instance: 68925f1b-da69-4955-acb1-d6500b03daee] raise exceptions.translate_fault(task_info.error) [ 1329.266161] env[62277]: ERROR nova.compute.manager [instance: 68925f1b-da69-4955-acb1-d6500b03daee] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1329.266161] env[62277]: ERROR nova.compute.manager [instance: 68925f1b-da69-4955-acb1-d6500b03daee] Faults: ['InvalidArgument'] [ 1329.266161] env[62277]: ERROR nova.compute.manager [instance: 68925f1b-da69-4955-acb1-d6500b03daee] [ 1329.267404] env[62277]: DEBUG nova.compute.utils [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] [instance: 68925f1b-da69-4955-acb1-d6500b03daee] VimFaultException {{(pid=62277) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1329.269612] env[62277]: DEBUG nova.compute.manager [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] [instance: 68925f1b-da69-4955-acb1-d6500b03daee] Build of instance 68925f1b-da69-4955-acb1-d6500b03daee was re-scheduled: A specified parameter was not correct: fileType [ 1329.269612] env[62277]: Faults: ['InvalidArgument'] {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1329.270011] env[62277]: DEBUG nova.compute.manager [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] [instance: 68925f1b-da69-4955-acb1-d6500b03daee] Unplugging VIFs for instance {{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1329.270252] env[62277]: DEBUG oslo_concurrency.lockutils [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Acquiring lock "refresh_cache-68925f1b-da69-4955-acb1-d6500b03daee" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1329.270400] 
env[62277]: DEBUG oslo_concurrency.lockutils [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Acquired lock "refresh_cache-68925f1b-da69-4955-acb1-d6500b03daee" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1329.270563] env[62277]: DEBUG nova.network.neutron [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] [instance: 68925f1b-da69-4955-acb1-d6500b03daee] Building network info cache for instance {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1329.302743] env[62277]: DEBUG nova.network.neutron [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] [instance: 68925f1b-da69-4955-acb1-d6500b03daee] Instance cache missing network info. {{(pid=62277) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1329.385126] env[62277]: DEBUG nova.network.neutron [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] [instance: 68925f1b-da69-4955-acb1-d6500b03daee] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1329.397136] env[62277]: DEBUG oslo_concurrency.lockutils [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Releasing lock "refresh_cache-68925f1b-da69-4955-acb1-d6500b03daee" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1329.397474] env[62277]: DEBUG nova.compute.manager [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1329.397654] env[62277]: DEBUG nova.compute.manager [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] [instance: 68925f1b-da69-4955-acb1-d6500b03daee] Skipping network deallocation for instance since networking was not requested. 
{{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1329.491843] env[62277]: INFO nova.scheduler.client.report [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Deleted allocations for instance 68925f1b-da69-4955-acb1-d6500b03daee [ 1329.513333] env[62277]: DEBUG oslo_concurrency.lockutils [None req-777072b0-d6ce-424e-a42c-821c1c07c45a tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Lock "68925f1b-da69-4955-acb1-d6500b03daee" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 327.023s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1329.514997] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e6449e7e-5171-4d0c-87cc-f1356b66e477 tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Lock "68925f1b-da69-4955-acb1-d6500b03daee" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 128.204s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1329.515235] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e6449e7e-5171-4d0c-87cc-f1356b66e477 tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Acquiring lock "68925f1b-da69-4955-acb1-d6500b03daee-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1329.515450] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e6449e7e-5171-4d0c-87cc-f1356b66e477 tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Lock "68925f1b-da69-4955-acb1-d6500b03daee-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1329.516382] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e6449e7e-5171-4d0c-87cc-f1356b66e477 tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Lock "68925f1b-da69-4955-acb1-d6500b03daee-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1329.517826] env[62277]: INFO nova.compute.manager [None req-e6449e7e-5171-4d0c-87cc-f1356b66e477 tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] [instance: 68925f1b-da69-4955-acb1-d6500b03daee] Terminating instance [ 1329.519365] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e6449e7e-5171-4d0c-87cc-f1356b66e477 tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Acquiring lock "refresh_cache-68925f1b-da69-4955-acb1-d6500b03daee" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1329.519518] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e6449e7e-5171-4d0c-87cc-f1356b66e477 
tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Acquired lock "refresh_cache-68925f1b-da69-4955-acb1-d6500b03daee" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1329.519681] env[62277]: DEBUG nova.network.neutron [None req-e6449e7e-5171-4d0c-87cc-f1356b66e477 tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] [instance: 68925f1b-da69-4955-acb1-d6500b03daee] Building network info cache for instance {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1329.533797] env[62277]: DEBUG nova.compute.manager [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1329.547065] env[62277]: DEBUG nova.network.neutron [None req-e6449e7e-5171-4d0c-87cc-f1356b66e477 tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] [instance: 68925f1b-da69-4955-acb1-d6500b03daee] Instance cache missing network info. {{(pid=62277) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1329.577811] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1329.578067] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1329.579571] env[62277]: INFO nova.compute.claims [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1329.852542] env[62277]: DEBUG nova.network.neutron [None req-e6449e7e-5171-4d0c-87cc-f1356b66e477 tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] [instance: 68925f1b-da69-4955-acb1-d6500b03daee] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1329.866435] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e6449e7e-5171-4d0c-87cc-f1356b66e477 tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Releasing lock "refresh_cache-68925f1b-da69-4955-acb1-d6500b03daee" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1329.866435] env[62277]: DEBUG nova.compute.manager [None req-e6449e7e-5171-4d0c-87cc-f1356b66e477 tempest-ServerDiagnosticsV248Test-1615622326 
tempest-ServerDiagnosticsV248Test-1615622326-project-member] [instance: 68925f1b-da69-4955-acb1-d6500b03daee] Start destroying the instance on the hypervisor. {{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1329.866435] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-e6449e7e-5171-4d0c-87cc-f1356b66e477 tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] [instance: 68925f1b-da69-4955-acb1-d6500b03daee] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1329.866435] env[62277]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-56e96f09-732e-4d43-a1bc-a3652fdd8d18 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1329.879010] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d29a50ef-8433-4953-ad9a-b2f741979b22 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1329.910083] env[62277]: WARNING nova.virt.vmwareapi.vmops [None req-e6449e7e-5171-4d0c-87cc-f1356b66e477 tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] [instance: 68925f1b-da69-4955-acb1-d6500b03daee] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 68925f1b-da69-4955-acb1-d6500b03daee could not be found. [ 1329.910309] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-e6449e7e-5171-4d0c-87cc-f1356b66e477 tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] [instance: 68925f1b-da69-4955-acb1-d6500b03daee] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1329.910486] env[62277]: INFO nova.compute.manager [None req-e6449e7e-5171-4d0c-87cc-f1356b66e477 tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] [instance: 68925f1b-da69-4955-acb1-d6500b03daee] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1329.910732] env[62277]: DEBUG oslo.service.loopingcall [None req-e6449e7e-5171-4d0c-87cc-f1356b66e477 tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1329.913246] env[62277]: DEBUG nova.compute.manager [-] [instance: 68925f1b-da69-4955-acb1-d6500b03daee] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1329.913352] env[62277]: DEBUG nova.network.neutron [-] [instance: 68925f1b-da69-4955-acb1-d6500b03daee] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1329.938094] env[62277]: DEBUG nova.network.neutron [-] [instance: 68925f1b-da69-4955-acb1-d6500b03daee] Instance cache missing network info. 
{{(pid=62277) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1329.945999] env[62277]: DEBUG nova.network.neutron [-] [instance: 68925f1b-da69-4955-acb1-d6500b03daee] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1329.956439] env[62277]: INFO nova.compute.manager [-] [instance: 68925f1b-da69-4955-acb1-d6500b03daee] Took 0.04 seconds to deallocate network for instance. [ 1329.999819] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5af16255-9c54-4c74-86cb-5dd859011432 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1330.009372] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2013e349-7cd8-4878-b318-4af01ae6b90c {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1330.039513] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-36bed2b3-295c-4f90-a8fe-de874c326fb7 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1330.047500] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4bddbd28-5181-44ea-8f13-3e79d8009615 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1330.055190] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e6449e7e-5171-4d0c-87cc-f1356b66e477 tempest-ServerDiagnosticsV248Test-1615622326 tempest-ServerDiagnosticsV248Test-1615622326-project-member] Lock "68925f1b-da69-4955-acb1-d6500b03daee" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.540s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1330.056129] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "68925f1b-da69-4955-acb1-d6500b03daee" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 81.919s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1330.059323] env[62277]: INFO nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 68925f1b-da69-4955-acb1-d6500b03daee] During sync_power_state the instance has a pending task (deleting). Skip. 
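The lockutils lines around the terminate path ("acquired ... :: waited 128.204s", '"released" ... :: held 327.023s') all follow the same named-lock pattern: the "waited" figure is the time spent blocking on the lock, the "held" figure is the time spent inside the critical section. A minimal, self-contained sketch of that timing semantics, using plain threading rather than the real oslo_concurrency implementation:

    # Mirrors the "acquired ... waited Xs" / '"released" ... held Ys' pattern
    # seen in the oslo_concurrency.lockutils entries; illustration only.
    import threading
    import time
    from collections import defaultdict
    from contextlib import contextmanager

    _locks = defaultdict(threading.Lock)

    @contextmanager
    def named_lock(name, caller):
        lock = _locks[name]
        t0 = time.monotonic()
        lock.acquire()                      # may block: this is the "waited" part
        waited = time.monotonic() - t0
        print('Lock "%s" acquired by "%s" :: waited %.3fs' % (name, caller, waited))
        try:
            yield
        finally:
            held = time.monotonic() - t0 - waited
            lock.release()
            print('Lock "%s" "released" by "%s" :: held %.3fs' % (name, caller, held))

    # usage, mirroring the per-instance lock taken around terminate_instance:
    with named_lock("68925f1b-da69-4955-acb1-d6500b03daee", "do_terminate_instance"):
        pass  # critical section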
[ 1330.059323] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "68925f1b-da69-4955-acb1-d6500b03daee" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1330.064959] env[62277]: DEBUG nova.compute.provider_tree [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1330.073624] env[62277]: DEBUG nova.scheduler.client.report [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1330.086986] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.509s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1330.087467] env[62277]: DEBUG nova.compute.manager [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Start building networks asynchronously for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1330.123018] env[62277]: DEBUG nova.compute.utils [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Using /dev/sd instead of None {{(pid=62277) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1330.124195] env[62277]: DEBUG nova.compute.manager [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Allocating IP information in the background. 
{{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1330.124388] env[62277]: DEBUG nova.network.neutron [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] allocate_for_instance() {{(pid=62277) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1330.134045] env[62277]: DEBUG nova.compute.manager [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Start building block device mappings for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1330.198326] env[62277]: DEBUG nova.compute.manager [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Start spawning the instance on the hypervisor. {{(pid=62277) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1330.201614] env[62277]: DEBUG nova.policy [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3a834d1a58b94907bc6944154314dce9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '24482eabb41e4102a26c9e7576a49c33', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62277) authorize /opt/stack/nova/nova/policy.py:203}} [ 1330.227139] env[62277]: DEBUG nova.virt.hardware [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-11-06T22:11:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-11-06T22:11:15Z,direct_url=,disk_format='vmdk',id=6f125163-af69-40e9-92ae-3b8a01d74b60,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='63ed6630e9c140baa826f53d7a0564d1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-11-06T22:11:16Z,virtual_size=,visibility=), allow threads: False {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1330.227392] env[62277]: DEBUG nova.virt.hardware [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Flavor limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1330.227536] env[62277]: DEBUG nova.virt.hardware [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 
tempest-AttachInterfacesTestJSON-2037203206-project-member] Image limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1330.227710] env[62277]: DEBUG nova.virt.hardware [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Flavor pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1330.227851] env[62277]: DEBUG nova.virt.hardware [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Image pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1330.228072] env[62277]: DEBUG nova.virt.hardware [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1330.228303] env[62277]: DEBUG nova.virt.hardware [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1330.228461] env[62277]: DEBUG nova.virt.hardware [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1330.228621] env[62277]: DEBUG nova.virt.hardware [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Got 1 possible topologies {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1330.228775] env[62277]: DEBUG nova.virt.hardware [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1330.228943] env[62277]: DEBUG nova.virt.hardware [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1330.229833] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ae63ef8c-0ca6-4fac-a70c-9aebf0001001 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1330.239209] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8acc4949-d68d-497a-86c6-1c01ff6a244c {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1330.581523] env[62277]: DEBUG nova.network.neutron [None 
req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Successfully created port: e78bfcaf-b01c-45be-9a23-1fdda831963b {{(pid=62277) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1331.495175] env[62277]: DEBUG nova.network.neutron [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Successfully updated port: e78bfcaf-b01c-45be-9a23-1fdda831963b {{(pid=62277) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1331.511472] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Acquiring lock "refresh_cache-6d759045-e1fc-43ea-a882-1ead769b6d29" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1331.511627] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Acquired lock "refresh_cache-6d759045-e1fc-43ea-a882-1ead769b6d29" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1331.511802] env[62277]: DEBUG nova.network.neutron [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Building network info cache for instance {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1331.554365] env[62277]: DEBUG nova.network.neutron [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Instance cache missing network info. 
{{(pid=62277) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1331.596110] env[62277]: DEBUG nova.compute.manager [req-9a05d717-8998-4881-94c5-7d2e0e26c77a req-b64f72a7-eefd-473c-85bf-a9e2dd33426f service nova] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Received event network-vif-plugged-e78bfcaf-b01c-45be-9a23-1fdda831963b {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 1331.596110] env[62277]: DEBUG oslo_concurrency.lockutils [req-9a05d717-8998-4881-94c5-7d2e0e26c77a req-b64f72a7-eefd-473c-85bf-a9e2dd33426f service nova] Acquiring lock "6d759045-e1fc-43ea-a882-1ead769b6d29-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1331.596110] env[62277]: DEBUG oslo_concurrency.lockutils [req-9a05d717-8998-4881-94c5-7d2e0e26c77a req-b64f72a7-eefd-473c-85bf-a9e2dd33426f service nova] Lock "6d759045-e1fc-43ea-a882-1ead769b6d29-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1331.596110] env[62277]: DEBUG oslo_concurrency.lockutils [req-9a05d717-8998-4881-94c5-7d2e0e26c77a req-b64f72a7-eefd-473c-85bf-a9e2dd33426f service nova] Lock "6d759045-e1fc-43ea-a882-1ead769b6d29-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1331.596110] env[62277]: DEBUG nova.compute.manager [req-9a05d717-8998-4881-94c5-7d2e0e26c77a req-b64f72a7-eefd-473c-85bf-a9e2dd33426f service nova] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] No waiting events found dispatching network-vif-plugged-e78bfcaf-b01c-45be-9a23-1fdda831963b {{(pid=62277) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1331.596110] env[62277]: WARNING nova.compute.manager [req-9a05d717-8998-4881-94c5-7d2e0e26c77a req-b64f72a7-eefd-473c-85bf-a9e2dd33426f service nova] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Received unexpected event network-vif-plugged-e78bfcaf-b01c-45be-9a23-1fdda831963b for instance with vm_state building and task_state spawning. 
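The network-vif-plugged handling above shows the external-event handshake: the spawn path may register a waiter for an event coming from Neutron, and an event that arrives before any waiter exists is reported as unexpected and simply dropped. A simplified, hypothetical sketch of that structure (class and field names are illustrative, not Nova's actual objects):

    # Simplified external-event dispatch; illustration of the behaviour in
    # the "No waiting events found" / "Received unexpected event" entries.
    import threading

    class InstanceEvents:
        def __init__(self):
            self._waiters = {}   # (instance_uuid, event_name) -> threading.Event
            self._lock = threading.Lock()

        def prepare_for_event(self, instance_uuid, event_name):
            ev = threading.Event()
            with self._lock:
                self._waiters[(instance_uuid, event_name)] = ev
            return ev            # the spawn path would later call ev.wait()

        def pop_instance_event(self, instance_uuid, event_name):
            with self._lock:
                return self._waiters.pop((instance_uuid, event_name), None)

    events = InstanceEvents()

    def external_instance_event(instance_uuid, event_name):
        ev = events.pop_instance_event(instance_uuid, event_name)
        if ev is None:
            print("Received unexpected event %s for instance %s" % (event_name, instance_uuid))
        else:
            ev.set()             # wake up the waiter in the spawn path

    external_instance_event("6d759045-e1fc-43ea-a882-1ead769b6d29",
                            "network-vif-plugged-e78bfcaf-b01c-45be-9a23-1fdda831963b")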
[ 1331.800906] env[62277]: DEBUG nova.network.neutron [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Updating instance_info_cache with network_info: [{"id": "e78bfcaf-b01c-45be-9a23-1fdda831963b", "address": "fa:16:3e:6a:fc:68", "network": {"id": "83a53f5b-0798-4c93-9294-0cdb526dc3ca", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1943573639-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "24482eabb41e4102a26c9e7576a49c33", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f85835c8-5d0c-4b2f-97c4-6c4006580f79", "external-id": "nsx-vlan-transportzone-245", "segmentation_id": 245, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape78bfcaf-b0", "ovs_interfaceid": "e78bfcaf-b01c-45be-9a23-1fdda831963b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1331.818148] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Releasing lock "refresh_cache-6d759045-e1fc-43ea-a882-1ead769b6d29" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1331.818411] env[62277]: DEBUG nova.compute.manager [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Instance network_info: |[{"id": "e78bfcaf-b01c-45be-9a23-1fdda831963b", "address": "fa:16:3e:6a:fc:68", "network": {"id": "83a53f5b-0798-4c93-9294-0cdb526dc3ca", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1943573639-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "24482eabb41e4102a26c9e7576a49c33", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f85835c8-5d0c-4b2f-97c4-6c4006580f79", "external-id": "nsx-vlan-transportzone-245", "segmentation_id": 245, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape78bfcaf-b0", "ovs_interfaceid": "e78bfcaf-b01c-45be-9a23-1fdda831963b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1331.818818] env[62277]: DEBUG 
nova.virt.vmwareapi.vmops [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:6a:fc:68', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'f85835c8-5d0c-4b2f-97c4-6c4006580f79', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'e78bfcaf-b01c-45be-9a23-1fdda831963b', 'vif_model': 'vmxnet3'}] {{(pid=62277) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1331.826419] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Creating folder: Project (24482eabb41e4102a26c9e7576a49c33). Parent ref: group-v297781. {{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1331.827050] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d7d8b2c8-3cd3-4343-b2d5-dc651bd90818 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1331.838381] env[62277]: INFO nova.virt.vmwareapi.vm_util [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Created folder: Project (24482eabb41e4102a26c9e7576a49c33) in parent group-v297781. [ 1331.838540] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Creating folder: Instances. Parent ref: group-v297838. {{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1331.838770] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-3c7bcced-bb7b-4ab6-85e1-f33e60ec2532 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1331.848621] env[62277]: INFO nova.virt.vmwareapi.vm_util [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Created folder: Instances in parent group-v297838. [ 1331.848621] env[62277]: DEBUG oslo.service.loopingcall [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1331.848621] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Creating VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1331.848621] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-d90f8d0e-b8f5-4cb7-828c-f78b674dc0d4 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1331.866978] env[62277]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1331.866978] env[62277]: value = "task-1405387" [ 1331.866978] env[62277]: _type = "Task" [ 1331.866978] env[62277]: } to complete. 
{{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1331.874233] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405387, 'name': CreateVM_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1332.378329] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405387, 'name': CreateVM_Task, 'duration_secs': 0.465567} completed successfully. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1332.378329] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Created VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1332.378801] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1332.378970] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1332.379314] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1332.379695] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e60b117f-0b02-4e10-9065-059305e82521 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1332.384020] env[62277]: DEBUG oslo_vmware.api [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Waiting for the task: (returnval){ [ 1332.384020] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]521c2095-b9ee-1227-e222-3b3da736a751" [ 1332.384020] env[62277]: _type = "Task" [ 1332.384020] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1332.391946] env[62277]: DEBUG oslo_vmware.api [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]521c2095-b9ee-1227-e222-3b3da736a751, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1332.895484] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1332.895790] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Processing image 6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1332.896019] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1333.748115] env[62277]: DEBUG nova.compute.manager [req-e11453e6-5f25-4582-b8c2-76e0656bfe27 req-ef39a6fa-1fba-471a-b62a-60095b249dab service nova] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Received event network-changed-e78bfcaf-b01c-45be-9a23-1fdda831963b {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 1333.748386] env[62277]: DEBUG nova.compute.manager [req-e11453e6-5f25-4582-b8c2-76e0656bfe27 req-ef39a6fa-1fba-471a-b62a-60095b249dab service nova] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Refreshing instance network info cache due to event network-changed-e78bfcaf-b01c-45be-9a23-1fdda831963b. {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11104}} [ 1333.748603] env[62277]: DEBUG oslo_concurrency.lockutils [req-e11453e6-5f25-4582-b8c2-76e0656bfe27 req-ef39a6fa-1fba-471a-b62a-60095b249dab service nova] Acquiring lock "refresh_cache-6d759045-e1fc-43ea-a882-1ead769b6d29" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1333.748745] env[62277]: DEBUG oslo_concurrency.lockutils [req-e11453e6-5f25-4582-b8c2-76e0656bfe27 req-ef39a6fa-1fba-471a-b62a-60095b249dab service nova] Acquired lock "refresh_cache-6d759045-e1fc-43ea-a882-1ead769b6d29" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1333.748898] env[62277]: DEBUG nova.network.neutron [req-e11453e6-5f25-4582-b8c2-76e0656bfe27 req-ef39a6fa-1fba-471a-b62a-60095b249dab service nova] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Refreshing network info cache for port e78bfcaf-b01c-45be-9a23-1fdda831963b {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1334.364263] env[62277]: DEBUG nova.network.neutron [req-e11453e6-5f25-4582-b8c2-76e0656bfe27 req-ef39a6fa-1fba-471a-b62a-60095b249dab service nova] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Updated VIF entry in instance network info cache for port e78bfcaf-b01c-45be-9a23-1fdda831963b. 
{{(pid=62277) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1334.364752] env[62277]: DEBUG nova.network.neutron [req-e11453e6-5f25-4582-b8c2-76e0656bfe27 req-ef39a6fa-1fba-471a-b62a-60095b249dab service nova] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Updating instance_info_cache with network_info: [{"id": "e78bfcaf-b01c-45be-9a23-1fdda831963b", "address": "fa:16:3e:6a:fc:68", "network": {"id": "83a53f5b-0798-4c93-9294-0cdb526dc3ca", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1943573639-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "24482eabb41e4102a26c9e7576a49c33", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f85835c8-5d0c-4b2f-97c4-6c4006580f79", "external-id": "nsx-vlan-transportzone-245", "segmentation_id": 245, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape78bfcaf-b0", "ovs_interfaceid": "e78bfcaf-b01c-45be-9a23-1fdda831963b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1334.378271] env[62277]: DEBUG oslo_concurrency.lockutils [req-e11453e6-5f25-4582-b8c2-76e0656bfe27 req-ef39a6fa-1fba-471a-b62a-60095b249dab service nova] Releasing lock "refresh_cache-6d759045-e1fc-43ea-a882-1ead769b6d29" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1337.069850] env[62277]: DEBUG oslo_concurrency.lockutils [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Acquiring lock "63267d5c-d004-41c1-866a-75b9e37521b7" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1337.070290] env[62277]: DEBUG oslo_concurrency.lockutils [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Lock "63267d5c-d004-41c1-866a-75b9e37521b7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1338.168360] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1338.168628] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Starting heal instance info cache {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9909}} [ 1338.168661] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Rebuilding the list of 
instances to heal {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9913}} [ 1338.191494] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1338.191701] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1338.191851] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1338.191978] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1338.192115] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1338.192239] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1338.192358] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1338.192519] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1338.192662] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1338.192783] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1338.192903] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Didn't find any instances for network info cache update. 
{{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9995}} [ 1338.193414] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1338.193591] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1340.189421] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1341.169809] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1341.169809] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1341.169809] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager.update_available_resource {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1341.184058] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1341.184319] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1341.184473] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1341.184651] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62277) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1341.185791] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-997326e1-9821-4f4d-9cbc-5f5d1e913675 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1341.194411] env[62277]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6279cb6-4379-497b-83ad-47a2257efd0d {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1341.209618] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a5022c9-29d2-4bc3-a3eb-681916160797 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1341.216050] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8784d5e3-af55-478f-b8ce-55ee727c238c {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1341.245259] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181424MB free_disk=184GB free_vcpus=48 pci_devices=None {{(pid=62277) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1341.245343] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1341.245524] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1341.319783] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 36ff1435-1999-4e95-8920-81a1b25cc452 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1341.319947] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 866c4415-caab-4d81-86ba-ed662feb3c4f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1341.320087] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 350e2302-66b9-4dd6-b0f4-77000992408b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1341.320211] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 23bc5a48-3e96-4897-bf28-ad14a0bdde62 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1341.320329] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 346748bd-b4e8-4e93-b71d-66c90a45e372 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1341.320479] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 930ff058-ab48-4c8a-8f5e-4820a3b12d50 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1341.320554] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1341.320668] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 1e8429b2-7149-4832-8590-e0ebd8501176 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1341.320780] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 21bd4623-2b46-43c4-859f-c4d3bf261e1f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1341.320892] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 6d759045-e1fc-43ea-a882-1ead769b6d29 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1341.334404] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 32bed248-06d5-47a1-b281-47921d99dbf6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1341.345720] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance e573e784-3318-4a41-89fd-40cbe8749413 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1341.355662] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 06571cd1-61a1-48e9-a204-624d5f383ad3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1341.366912] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 8d00162c-7379-48b6-841b-f802db2582db has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1341.378703] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance c8d02374-bed2-4b4a-9bab-3a3dec87ad3e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1341.388442] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance df611bf9-45db-4940-a59e-fccc7d96b935 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1341.398211] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 5d595f5e-6d35-4c89-a4e2-a3639c6145c8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1341.407536] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 4fd54f91-dedd-4ce2-8acf-8a2123be73b8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1341.416373] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance baabe4ee-b366-45a8-bf06-cd63f697e7dc has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1341.424988] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 5cf06245-3fa1-4596-8260-7a82bc4a1193 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1341.433907] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 308e5c48-c452-4dbf-94b0-1eb12951e620 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1341.442418] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 39172747-1245-473d-9f18-87bae208b5b1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1341.451232] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 7ecfbbee-4955-4704-af62-ce8f5470cfbe has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1341.462073] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 4ca037fc-9a4e-413b-9b4e-2122f5a4fe18 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1341.471144] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 900160c8-a715-45a4-8709-b314fc3216d5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1341.480960] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 0901d48b-bc88-461b-8503-eb6b51c39148 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1341.489522] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 63267d5c-d004-41c1-866a-75b9e37521b7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1341.489751] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1341.489898] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1341.774711] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1180fc67-1e54-43cd-a148-aefd7b431651 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1341.783024] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-feae9396-e775-41c1-8b68-ea3fbf7e4e11 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1341.811299] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-393d8b1f-c3aa-43f5-abd6-0d837d027763 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1341.819164] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc212e3e-9952-44d2-8f3f-d553158cbf14 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1341.831876] env[62277]: DEBUG nova.compute.provider_tree [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1341.840167] env[62277]: DEBUG nova.scheduler.client.report [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1341.855806] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62277) 
_update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1341.855806] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.610s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1342.855752] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1343.164337] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1346.167982] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1346.168322] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=62277) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10528}} [ 1378.380299] env[62277]: WARNING oslo_vmware.rw_handles [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1378.380299] env[62277]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1378.380299] env[62277]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1378.380299] env[62277]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1378.380299] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1378.380299] env[62277]: ERROR oslo_vmware.rw_handles response.begin() [ 1378.380299] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1378.380299] env[62277]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1378.380299] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1378.380299] env[62277]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1378.380299] env[62277]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1378.380299] env[62277]: ERROR oslo_vmware.rw_handles [ 1378.381145] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] Downloaded image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to vmware_temp/f17dde81-ffb5-46b0-b23f-eeac5e5c5094/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 
{{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1378.382848] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] Caching image {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1378.383043] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Copying Virtual Disk [datastore2] vmware_temp/f17dde81-ffb5-46b0-b23f-eeac5e5c5094/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk to [datastore2] vmware_temp/f17dde81-ffb5-46b0-b23f-eeac5e5c5094/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk {{(pid=62277) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1378.383382] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-fbb009e8-aaf9-40c9-9b88-1f4b4f8afbdc {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1378.393085] env[62277]: DEBUG oslo_vmware.api [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Waiting for the task: (returnval){ [ 1378.393085] env[62277]: value = "task-1405388" [ 1378.393085] env[62277]: _type = "Task" [ 1378.393085] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1378.402055] env[62277]: DEBUG oslo_vmware.api [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Task: {'id': task-1405388, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1378.905968] env[62277]: DEBUG oslo_vmware.exceptions [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Fault InvalidArgument not matched. 
{{(pid=62277) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1378.906286] env[62277]: DEBUG oslo_concurrency.lockutils [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1378.906836] env[62277]: ERROR nova.compute.manager [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1378.906836] env[62277]: Faults: ['InvalidArgument'] [ 1378.906836] env[62277]: ERROR nova.compute.manager [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] Traceback (most recent call last): [ 1378.906836] env[62277]: ERROR nova.compute.manager [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1378.906836] env[62277]: ERROR nova.compute.manager [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] yield resources [ 1378.906836] env[62277]: ERROR nova.compute.manager [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1378.906836] env[62277]: ERROR nova.compute.manager [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] self.driver.spawn(context, instance, image_meta, [ 1378.906836] env[62277]: ERROR nova.compute.manager [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1378.906836] env[62277]: ERROR nova.compute.manager [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1378.906836] env[62277]: ERROR nova.compute.manager [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1378.906836] env[62277]: ERROR nova.compute.manager [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] self._fetch_image_if_missing(context, vi) [ 1378.906836] env[62277]: ERROR nova.compute.manager [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1378.906836] env[62277]: ERROR nova.compute.manager [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] image_cache(vi, tmp_image_ds_loc) [ 1378.906836] env[62277]: ERROR nova.compute.manager [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1378.906836] env[62277]: ERROR nova.compute.manager [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] vm_util.copy_virtual_disk( [ 1378.906836] env[62277]: ERROR nova.compute.manager [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1378.906836] env[62277]: ERROR nova.compute.manager [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] session._wait_for_task(vmdk_copy_task) [ 1378.906836] env[62277]: ERROR nova.compute.manager [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", 
line 157, in _wait_for_task [ 1378.906836] env[62277]: ERROR nova.compute.manager [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] return self.wait_for_task(task_ref) [ 1378.906836] env[62277]: ERROR nova.compute.manager [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1378.906836] env[62277]: ERROR nova.compute.manager [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] return evt.wait() [ 1378.906836] env[62277]: ERROR nova.compute.manager [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1378.906836] env[62277]: ERROR nova.compute.manager [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] result = hub.switch() [ 1378.906836] env[62277]: ERROR nova.compute.manager [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1378.906836] env[62277]: ERROR nova.compute.manager [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] return self.greenlet.switch() [ 1378.906836] env[62277]: ERROR nova.compute.manager [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1378.906836] env[62277]: ERROR nova.compute.manager [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] self.f(*self.args, **self.kw) [ 1378.906836] env[62277]: ERROR nova.compute.manager [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1378.906836] env[62277]: ERROR nova.compute.manager [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] raise exceptions.translate_fault(task_info.error) [ 1378.906836] env[62277]: ERROR nova.compute.manager [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1378.906836] env[62277]: ERROR nova.compute.manager [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] Faults: ['InvalidArgument'] [ 1378.906836] env[62277]: ERROR nova.compute.manager [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] [ 1378.907756] env[62277]: INFO nova.compute.manager [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] Terminating instance [ 1378.908776] env[62277]: DEBUG oslo_concurrency.lockutils [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1378.908975] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1378.909230] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4a173b7a-a92e-431d-a59a-7cfbfd90d5c3 {{(pid=62277) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1378.912693] env[62277]: DEBUG nova.compute.manager [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] Start destroying the instance on the hypervisor. {{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1378.912909] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1378.913701] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc0d4850-80b6-45f4-9480-0a151fe72d80 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1378.917816] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1378.918029] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62277) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1378.920448] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-44b63ea4-fd62-4ccd-8d5f-15da60844d6c {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1378.922745] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] Unregistering the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1378.923219] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-bce3cf90-9d24-4e2a-a7c8-9f37f8623187 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1378.928063] env[62277]: DEBUG oslo_vmware.api [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Waiting for the task: (returnval){ [ 1378.928063] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]529c2973-4168-e834-aa30-ca3579a18492" [ 1378.928063] env[62277]: _type = "Task" [ 1378.928063] env[62277]: } to complete. 
{{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1378.934470] env[62277]: DEBUG oslo_vmware.api [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]529c2973-4168-e834-aa30-ca3579a18492, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1378.994759] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] Unregistered the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1378.995479] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] Deleting contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1378.995637] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Deleting the datastore file [datastore2] 36ff1435-1999-4e95-8920-81a1b25cc452 {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1378.995978] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-dbd5ee77-de70-4838-8e97-1764592374c4 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1379.003252] env[62277]: DEBUG oslo_vmware.api [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Waiting for the task: (returnval){ [ 1379.003252] env[62277]: value = "task-1405390" [ 1379.003252] env[62277]: _type = "Task" [ 1379.003252] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1379.012041] env[62277]: DEBUG oslo_vmware.api [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Task: {'id': task-1405390, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1379.439034] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Preparing fetch location {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1379.439302] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Creating directory with path [datastore2] vmware_temp/00af3c49-1285-4aa2-b297-51bf8b2f0020/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1379.439412] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b1ecb4c2-e13b-4c9a-9217-6109781cb0b7 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1379.452049] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Created directory with path [datastore2] vmware_temp/00af3c49-1285-4aa2-b297-51bf8b2f0020/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1379.452049] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Fetch image to [datastore2] vmware_temp/00af3c49-1285-4aa2-b297-51bf8b2f0020/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1379.452281] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to [datastore2] vmware_temp/00af3c49-1285-4aa2-b297-51bf8b2f0020/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1379.453086] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-83783bdb-3cdc-4b9f-9886-daf0ea8f484a {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1379.460756] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a801e8d-8518-4640-943a-24e18cf5638a {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1379.470528] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-39e59516-2813-4875-9356-06b436273267 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1379.503273] env[62277]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0024654-102d-4060-8ec7-1f806888dfe9 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1379.514878] env[62277]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-cc987215-21ee-429c-b5d3-0af246cb7aeb {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1379.516701] env[62277]: DEBUG oslo_vmware.api [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Task: {'id': task-1405390, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.083737} completed successfully. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1379.516931] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Deleted the datastore file {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1379.517165] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] Deleted contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1379.517332] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1379.517537] env[62277]: INFO nova.compute.manager [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] Took 0.60 seconds to destroy the instance on the hypervisor. 
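[editor's note] The records above show the spawn of instance 36ff1435-1999-4e95-8920-81a1b25cc452 failing while caching the image: CopyVirtualDisk_Task errors out, wait_for_task raises oslo_vmware.exceptions.VimFaultException ("A specified parameter was not correct: fileType", Faults: ['InvalidArgument']), and the VM is then unregistered and its datastore files deleted. As a minimal sketch only (not Nova's actual code; 'session' and 'copy_task' are placeholders for an oslo_vmware.api.VMwareAPISession and the CopyVirtualDisk_Task reference), this is roughly how such a fault surfaces to a caller of oslo.vmware:

    from oslo_vmware import exceptions as vexc

    def wait_for_copy(session, copy_task):
        """Wait on a CopyVirtualDisk_Task and surface vCenter faults (sketch)."""
        try:
            # wait_for_task() polls the task (the "progress is 0%" records above)
            # and raises a translated fault if the task ends in error.
            return session.wait_for_task(copy_task)
        except vexc.VimFaultException as exc:
            # fault_list carries the vCenter fault names, e.g. ['InvalidArgument']
            # in the traceback above; here we just report and re-raise, while the
            # log shows Nova aborting the claim and re-scheduling the build.
            print("copy failed with faults: %s" % exc.fault_list)
            raise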
[ 1379.519668] env[62277]: DEBUG nova.compute.claims [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] Aborting claim: {{(pid=62277) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1379.519833] env[62277]: DEBUG oslo_concurrency.lockutils [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1379.520058] env[62277]: DEBUG oslo_concurrency.lockutils [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1379.539548] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1379.594263] env[62277]: DEBUG oslo_vmware.rw_handles [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/00af3c49-1285-4aa2-b297-51bf8b2f0020/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62277) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1379.654342] env[62277]: DEBUG oslo_vmware.rw_handles [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Completed reading data from the image iterator. {{(pid=62277) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1379.654532] env[62277]: DEBUG oslo_vmware.rw_handles [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/00af3c49-1285-4aa2-b297-51bf8b2f0020/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=62277) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1379.937147] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b5faad8-82f6-43bb-bbec-fc49ceced4fc {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1379.944934] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c21a3001-ec98-470d-8862-9a2777c5a77f {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1379.973999] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1afe39a8-f92d-4256-9c07-ca1ddbbea2d3 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1379.981038] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd5b90cb-2ed6-4ae1-a450-821c735e97a9 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1379.993574] env[62277]: DEBUG nova.compute.provider_tree [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1380.002159] env[62277]: DEBUG nova.scheduler.client.report [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1380.016232] env[62277]: DEBUG oslo_concurrency.lockutils [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.496s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1380.016742] env[62277]: ERROR nova.compute.manager [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1380.016742] env[62277]: Faults: ['InvalidArgument'] [ 1380.016742] env[62277]: ERROR nova.compute.manager [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] Traceback (most recent call last): [ 1380.016742] env[62277]: ERROR nova.compute.manager [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1380.016742] env[62277]: ERROR 
nova.compute.manager [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] self.driver.spawn(context, instance, image_meta, [ 1380.016742] env[62277]: ERROR nova.compute.manager [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1380.016742] env[62277]: ERROR nova.compute.manager [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1380.016742] env[62277]: ERROR nova.compute.manager [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1380.016742] env[62277]: ERROR nova.compute.manager [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] self._fetch_image_if_missing(context, vi) [ 1380.016742] env[62277]: ERROR nova.compute.manager [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1380.016742] env[62277]: ERROR nova.compute.manager [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] image_cache(vi, tmp_image_ds_loc) [ 1380.016742] env[62277]: ERROR nova.compute.manager [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1380.016742] env[62277]: ERROR nova.compute.manager [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] vm_util.copy_virtual_disk( [ 1380.016742] env[62277]: ERROR nova.compute.manager [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1380.016742] env[62277]: ERROR nova.compute.manager [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] session._wait_for_task(vmdk_copy_task) [ 1380.016742] env[62277]: ERROR nova.compute.manager [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1380.016742] env[62277]: ERROR nova.compute.manager [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] return self.wait_for_task(task_ref) [ 1380.016742] env[62277]: ERROR nova.compute.manager [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1380.016742] env[62277]: ERROR nova.compute.manager [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] return evt.wait() [ 1380.016742] env[62277]: ERROR nova.compute.manager [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1380.016742] env[62277]: ERROR nova.compute.manager [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] result = hub.switch() [ 1380.016742] env[62277]: ERROR nova.compute.manager [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1380.016742] env[62277]: ERROR nova.compute.manager [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] return self.greenlet.switch() [ 1380.016742] env[62277]: ERROR nova.compute.manager [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1380.016742] env[62277]: ERROR nova.compute.manager [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] self.f(*self.args, **self.kw) [ 1380.016742] env[62277]: ERROR nova.compute.manager [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1380.016742] env[62277]: ERROR nova.compute.manager [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] raise exceptions.translate_fault(task_info.error) [ 1380.016742] env[62277]: ERROR nova.compute.manager [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1380.016742] env[62277]: ERROR nova.compute.manager [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] Faults: ['InvalidArgument'] [ 1380.016742] env[62277]: ERROR nova.compute.manager [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] [ 1380.017845] env[62277]: DEBUG nova.compute.utils [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] VimFaultException {{(pid=62277) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1380.018805] env[62277]: DEBUG nova.compute.manager [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] Build of instance 36ff1435-1999-4e95-8920-81a1b25cc452 was re-scheduled: A specified parameter was not correct: fileType [ 1380.018805] env[62277]: Faults: ['InvalidArgument'] {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1380.019185] env[62277]: DEBUG nova.compute.manager [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] Unplugging VIFs for instance {{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1380.019355] env[62277]: DEBUG nova.compute.manager [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1380.019516] env[62277]: DEBUG nova.compute.manager [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1380.019668] env[62277]: DEBUG nova.network.neutron [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1380.399568] env[62277]: DEBUG nova.network.neutron [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1380.413446] env[62277]: INFO nova.compute.manager [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] Took 0.39 seconds to deallocate network for instance. [ 1380.533985] env[62277]: INFO nova.scheduler.client.report [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Deleted allocations for instance 36ff1435-1999-4e95-8920-81a1b25cc452 [ 1380.553469] env[62277]: DEBUG oslo_concurrency.lockutils [None req-4f6898e2-e855-4094-8b4f-934ce78b5df8 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Lock "36ff1435-1999-4e95-8920-81a1b25cc452" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 379.406s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1380.554589] env[62277]: DEBUG oslo_concurrency.lockutils [None req-22fb2479-820d-46db-8404-196b9598fe72 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Lock "36ff1435-1999-4e95-8920-81a1b25cc452" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 270.877s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1380.554797] env[62277]: DEBUG oslo_concurrency.lockutils [None req-22fb2479-820d-46db-8404-196b9598fe72 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Acquiring lock "36ff1435-1999-4e95-8920-81a1b25cc452-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1380.555220] env[62277]: DEBUG oslo_concurrency.lockutils [None req-22fb2479-820d-46db-8404-196b9598fe72 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Lock "36ff1435-1999-4e95-8920-81a1b25cc452-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=62277) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1380.555317] env[62277]: DEBUG oslo_concurrency.lockutils [None req-22fb2479-820d-46db-8404-196b9598fe72 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Lock "36ff1435-1999-4e95-8920-81a1b25cc452-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1380.557276] env[62277]: INFO nova.compute.manager [None req-22fb2479-820d-46db-8404-196b9598fe72 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] Terminating instance [ 1380.560797] env[62277]: DEBUG nova.compute.manager [None req-22fb2479-820d-46db-8404-196b9598fe72 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] Start destroying the instance on the hypervisor. {{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1380.560797] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-22fb2479-820d-46db-8404-196b9598fe72 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1380.561052] env[62277]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-8a915e92-a28e-40af-9445-260cd55792f6 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1380.570336] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-38a06f35-f7a3-43fe-9a18-c574b81f9659 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1380.580697] env[62277]: DEBUG nova.compute.manager [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1380.601117] env[62277]: WARNING nova.virt.vmwareapi.vmops [None req-22fb2479-820d-46db-8404-196b9598fe72 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 36ff1435-1999-4e95-8920-81a1b25cc452 could not be found. [ 1380.601380] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-22fb2479-820d-46db-8404-196b9598fe72 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1380.601558] env[62277]: INFO nova.compute.manager [None req-22fb2479-820d-46db-8404-196b9598fe72 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] Took 0.04 seconds to destroy the instance on the hypervisor. 
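Annotation (not part of the log): the traceback above ends in oslo_vmware.api's _poll_task raising a translated fault once the vCenter CopyVirtualDisk_Task reports an error. Below is a hedged, self-contained sketch of that poll-until-complete-or-fault pattern using only the standard library; the names TaskError, get_task_info and the dict shape are hypothetical and this is not the oslo.vmware implementation.

    import time


    class TaskError(Exception):
        """Raised when the remote task finishes in an error state."""


    def wait_for_task(get_task_info, interval=0.5, timeout=60.0):
        """Poll a task-info callable until it reports success or error."""
        deadline = time.monotonic() + timeout
        while time.monotonic() < deadline:
            info = get_task_info()  # e.g. {'state': 'running', 'progress': 40}
            if info['state'] == 'success':
                return info
            if info['state'] == 'error':
                # Analogous to raising translate_fault(task_info.error) in the
                # traceback above: the poller converts the task error into an
                # exception for the caller that started the task.
                raise TaskError(info.get('error', 'unknown fault'))
            time.sleep(interval)
        raise TimeoutError('task did not complete in time')

The traceback then shows how such an exception propagates back up through spawn() into _build_and_run_instance, which is why the build is re-scheduled rather than retried in place.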
[ 1380.601853] env[62277]: DEBUG oslo.service.loopingcall [None req-22fb2479-820d-46db-8404-196b9598fe72 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1380.602021] env[62277]: DEBUG nova.compute.manager [-] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1380.602123] env[62277]: DEBUG nova.network.neutron [-] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1380.634133] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1380.634384] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1380.636117] env[62277]: INFO nova.compute.claims [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1380.640031] env[62277]: DEBUG nova.network.neutron [-] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1380.656132] env[62277]: INFO nova.compute.manager [-] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] Took 0.05 seconds to deallocate network for instance. [ 1380.759140] env[62277]: DEBUG oslo_concurrency.lockutils [None req-22fb2479-820d-46db-8404-196b9598fe72 tempest-ServersAdminTestJSON-1397695677 tempest-ServersAdminTestJSON-1397695677-project-member] Lock "36ff1435-1999-4e95-8920-81a1b25cc452" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.204s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1380.759994] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "36ff1435-1999-4e95-8920-81a1b25cc452" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 132.623s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1380.760199] env[62277]: INFO nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 36ff1435-1999-4e95-8920-81a1b25cc452] During sync_power_state the instance has a pending task (deleting). Skip. 
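Annotation (not part of the log): the lockutils DEBUG lines throughout this section come in a fixed trio — "Acquiring lock", "acquired :: waited Ns", "released :: held Ns" — which is how long waits (e.g. the 270.877s and 132.623s waits above) become visible. The following is a simplified stdlib sketch of that timing pattern, not the oslo_concurrency code; the helper name timed_lock is invented for illustration.

    import contextlib
    import logging
    import threading
    import time

    LOG = logging.getLogger(__name__)
    _locks: dict[str, threading.Lock] = {}


    @contextlib.contextmanager
    def timed_lock(name: str, owner: str):
        """Log how long we waited for, and then held, a named lock."""
        lock = _locks.setdefault(name, threading.Lock())
        LOG.debug('Acquiring lock "%s" by "%s"', name, owner)
        t0 = time.monotonic()
        lock.acquire()
        LOG.debug('Lock "%s" acquired by "%s" :: waited %.3fs',
                  name, owner, time.monotonic() - t0)
        t1 = time.monotonic()
        try:
            yield
        finally:
            lock.release()
            LOG.debug('Lock "%s" "released" by "%s" :: held %.3fs',
                      name, owner, time.monotonic() - t1)

    # Usage mirroring the resource-tracker entries above:
    # with timed_lock("compute_resources", "ResourceTracker.abort_instance_claim"):
    #     ...  # critical section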
[ 1380.760365] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "36ff1435-1999-4e95-8920-81a1b25cc452" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1380.999023] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c0af0a9f-f28a-4ad9-a8db-5e5b2140b7a5 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1381.006642] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7a9a504f-953e-49e2-a3b1-d876746bacca {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1381.037938] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6147589a-7694-44d5-a231-eafac333b18c {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1381.045042] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9c0887cd-f026-4342-94de-5dac8cbaed47 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1381.057998] env[62277]: DEBUG nova.compute.provider_tree [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1381.067473] env[62277]: DEBUG nova.scheduler.client.report [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1381.083520] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.449s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1381.083993] env[62277]: DEBUG nova.compute.manager [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Start building networks asynchronously for instance. 
{{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1381.117296] env[62277]: DEBUG nova.compute.utils [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Using /dev/sd instead of None {{(pid=62277) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1381.119019] env[62277]: DEBUG nova.compute.manager [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Allocating IP information in the background. {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1381.119195] env[62277]: DEBUG nova.network.neutron [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] allocate_for_instance() {{(pid=62277) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1381.128116] env[62277]: DEBUG nova.compute.manager [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Start building block device mappings for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1381.196209] env[62277]: DEBUG nova.compute.manager [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Start spawning the instance on the hypervisor. {{(pid=62277) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1381.200940] env[62277]: DEBUG nova.policy [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '17b9b0c1ba02478887c57c995373374d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3562addabe29452fbe00a6d53b316340', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62277) authorize /opt/stack/nova/nova/policy.py:203}} [ 1381.229676] env[62277]: DEBUG nova.virt.hardware [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-11-06T22:11:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta 
ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-11-06T22:11:15Z,direct_url=,disk_format='vmdk',id=6f125163-af69-40e9-92ae-3b8a01d74b60,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='63ed6630e9c140baa826f53d7a0564d1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-11-06T22:11:16Z,virtual_size=,visibility=), allow threads: False {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1381.229676] env[62277]: DEBUG nova.virt.hardware [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Flavor limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1381.229834] env[62277]: DEBUG nova.virt.hardware [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Image limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1381.229972] env[62277]: DEBUG nova.virt.hardware [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Flavor pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1381.230125] env[62277]: DEBUG nova.virt.hardware [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Image pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1381.230268] env[62277]: DEBUG nova.virt.hardware [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1381.230465] env[62277]: DEBUG nova.virt.hardware [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1381.230614] env[62277]: DEBUG nova.virt.hardware [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1381.230772] env[62277]: DEBUG nova.virt.hardware [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Got 1 possible topologies {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1381.230931] env[62277]: DEBUG nova.virt.hardware [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1381.231217] env[62277]: DEBUG nova.virt.hardware [None 
req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1381.232107] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a3ab73c-bfec-4aca-8330-a768c833ffb6 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1381.242422] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f84a8ccd-78f8-4b8d-ab61-3da25bbefe7a {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1381.834312] env[62277]: DEBUG nova.network.neutron [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Successfully created port: 22e16106-2e17-48e1-a3df-13744f9ed1e0 {{(pid=62277) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1382.503350] env[62277]: DEBUG nova.compute.manager [req-e62677bf-aa33-496c-8afe-2ed7e52b484a req-3348689d-8581-4f0a-aaf2-0c415a3e042a service nova] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Received event network-vif-plugged-22e16106-2e17-48e1-a3df-13744f9ed1e0 {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 1382.503542] env[62277]: DEBUG oslo_concurrency.lockutils [req-e62677bf-aa33-496c-8afe-2ed7e52b484a req-3348689d-8581-4f0a-aaf2-0c415a3e042a service nova] Acquiring lock "32bed248-06d5-47a1-b281-47921d99dbf6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1382.503743] env[62277]: DEBUG oslo_concurrency.lockutils [req-e62677bf-aa33-496c-8afe-2ed7e52b484a req-3348689d-8581-4f0a-aaf2-0c415a3e042a service nova] Lock "32bed248-06d5-47a1-b281-47921d99dbf6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1382.503915] env[62277]: DEBUG oslo_concurrency.lockutils [req-e62677bf-aa33-496c-8afe-2ed7e52b484a req-3348689d-8581-4f0a-aaf2-0c415a3e042a service nova] Lock "32bed248-06d5-47a1-b281-47921d99dbf6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1382.504212] env[62277]: DEBUG nova.compute.manager [req-e62677bf-aa33-496c-8afe-2ed7e52b484a req-3348689d-8581-4f0a-aaf2-0c415a3e042a service nova] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] No waiting events found dispatching network-vif-plugged-22e16106-2e17-48e1-a3df-13744f9ed1e0 {{(pid=62277) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1382.504407] env[62277]: WARNING nova.compute.manager [req-e62677bf-aa33-496c-8afe-2ed7e52b484a req-3348689d-8581-4f0a-aaf2-0c415a3e042a service nova] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Received unexpected event network-vif-plugged-22e16106-2e17-48e1-a3df-13744f9ed1e0 for instance with vm_state building and task_state spawning. 
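Annotation (not part of the log): the nova.virt.hardware lines above walk from flavor/image limits to "Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)]" for a 1-vCPU flavor. A minimal sketch of that enumeration idea follows — every (sockets, cores, threads) triple whose product equals the vCPU count within per-dimension maxima. This is a simplification for illustration, not the nova.virt.hardware algorithm, and the small default maxima are assumptions.

    from typing import NamedTuple


    class CPUTopology(NamedTuple):
        sockets: int
        cores: int
        threads: int


    def possible_topologies(vcpus, max_sockets=8, max_cores=8, max_threads=2):
        """Enumerate (sockets, cores, threads) triples whose product is vcpus."""
        found = []
        for sockets in range(1, min(vcpus, max_sockets) + 1):
            if vcpus % sockets:
                continue
            per_socket = vcpus // sockets
            for cores in range(1, min(per_socket, max_cores) + 1):
                if per_socket % cores:
                    continue
                threads = per_socket // cores
                if threads <= max_threads:
                    found.append(CPUTopology(sockets, cores, threads))
        return found


    print(possible_topologies(1))  # [CPUTopology(sockets=1, cores=1, threads=1)]
    print(possible_topologies(4))  # (1,2,2), (1,4,1), (2,1,2), (2,2,1), (4,1,1)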
[ 1382.659225] env[62277]: DEBUG nova.network.neutron [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Successfully updated port: 22e16106-2e17-48e1-a3df-13744f9ed1e0 {{(pid=62277) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1382.670858] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Acquiring lock "refresh_cache-32bed248-06d5-47a1-b281-47921d99dbf6" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1382.671015] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Acquired lock "refresh_cache-32bed248-06d5-47a1-b281-47921d99dbf6" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1382.671167] env[62277]: DEBUG nova.network.neutron [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Building network info cache for instance {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1382.709415] env[62277]: DEBUG nova.network.neutron [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Instance cache missing network info. {{(pid=62277) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1382.934137] env[62277]: DEBUG nova.network.neutron [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Updating instance_info_cache with network_info: [{"id": "22e16106-2e17-48e1-a3df-13744f9ed1e0", "address": "fa:16:3e:2c:65:88", "network": {"id": "22a3312e-af99-4873-b83e-83852193ff8b", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-2014430520-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3562addabe29452fbe00a6d53b316340", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7bcd9d2d-25c8-41ad-9a4a-93b9029ba993", "external-id": "nsx-vlan-transportzone-734", "segmentation_id": 734, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap22e16106-2e", "ovs_interfaceid": "22e16106-2e17-48e1-a3df-13744f9ed1e0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1382.950099] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d 
tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Releasing lock "refresh_cache-32bed248-06d5-47a1-b281-47921d99dbf6" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1382.950099] env[62277]: DEBUG nova.compute.manager [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Instance network_info: |[{"id": "22e16106-2e17-48e1-a3df-13744f9ed1e0", "address": "fa:16:3e:2c:65:88", "network": {"id": "22a3312e-af99-4873-b83e-83852193ff8b", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-2014430520-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3562addabe29452fbe00a6d53b316340", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7bcd9d2d-25c8-41ad-9a4a-93b9029ba993", "external-id": "nsx-vlan-transportzone-734", "segmentation_id": 734, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap22e16106-2e", "ovs_interfaceid": "22e16106-2e17-48e1-a3df-13744f9ed1e0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1382.950528] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:2c:65:88', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '7bcd9d2d-25c8-41ad-9a4a-93b9029ba993', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '22e16106-2e17-48e1-a3df-13744f9ed1e0', 'vif_model': 'vmxnet3'}] {{(pid=62277) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1382.957961] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Creating folder: Project (3562addabe29452fbe00a6d53b316340). Parent ref: group-v297781. {{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1382.959030] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f54eb488-cd50-4494-a9d0-9aed28fed2e8 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1382.969863] env[62277]: INFO nova.virt.vmwareapi.vm_util [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Created folder: Project (3562addabe29452fbe00a6d53b316340) in parent group-v297781. 
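Annotation (not part of the log): the large instance_info_cache blob above carries the port id, MAC and fixed IPs that later become the VIF info passed to the VMware driver. As a reading aid, here is a small helper that summarizes that structure; the sample dict copies only a few fields from the log entry, the field names match that structure, and the helper itself (summarize_vifs) is an invented illustration, not Nova code.

    import json

    network_info_json = """
    [{"id": "22e16106-2e17-48e1-a3df-13744f9ed1e0",
      "address": "fa:16:3e:2c:65:88",
      "network": {"id": "22a3312e-af99-4873-b83e-83852193ff8b",
                  "bridge": "br-int",
                  "subnets": [{"cidr": "192.168.128.0/28",
                               "ips": [{"address": "192.168.128.4", "type": "fixed"}]}]}}]
    """


    def summarize_vifs(network_info):
        """Return (port_id, mac, [fixed ips]) tuples from a network_info list."""
        out = []
        for vif in network_info:
            ips = [ip["address"]
                   for subnet in vif["network"]["subnets"]
                   for ip in subnet["ips"]
                   if ip.get("type") == "fixed"]
            out.append((vif["id"], vif["address"], ips))
        return out


    print(summarize_vifs(json.loads(network_info_json)))
    # [('22e16106-2e17-48e1-a3df-13744f9ed1e0', 'fa:16:3e:2c:65:88', ['192.168.128.4'])]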
[ 1382.970057] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Creating folder: Instances. Parent ref: group-v297841. {{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1382.970282] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7528429f-e194-4cac-b9eb-0a7f0f2fa095 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1382.979547] env[62277]: INFO nova.virt.vmwareapi.vm_util [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Created folder: Instances in parent group-v297841. [ 1382.979717] env[62277]: DEBUG oslo.service.loopingcall [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1382.979886] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Creating VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1382.980128] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-c7c4a0a1-5eff-441d-b934-0273050ec03f {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1382.999205] env[62277]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1382.999205] env[62277]: value = "task-1405393" [ 1382.999205] env[62277]: _type = "Task" [ 1382.999205] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1383.006293] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405393, 'name': CreateVM_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1383.509281] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405393, 'name': CreateVM_Task, 'duration_secs': 0.274} completed successfully. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1383.509393] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Created VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1383.510900] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1383.510900] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1383.510900] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1383.510900] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-86dc5018-a24f-456f-ad0c-1fd5f55b9db8 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1383.515323] env[62277]: DEBUG oslo_vmware.api [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Waiting for the task: (returnval){ [ 1383.515323] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52cc1e43-079c-3424-fcc0-cb8b31a56421" [ 1383.515323] env[62277]: _type = "Task" [ 1383.515323] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1383.526736] env[62277]: DEBUG oslo_vmware.api [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52cc1e43-079c-3424-fcc0-cb8b31a56421, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1384.025610] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1384.025944] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Processing image 6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1384.026220] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1384.556125] env[62277]: DEBUG nova.compute.manager [req-fdf2f02f-2ba2-4b6f-98d0-05dcc5d0e25b req-fe8c93ae-a115-4d55-aa6e-87a390f35045 service nova] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Received event network-changed-22e16106-2e17-48e1-a3df-13744f9ed1e0 {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 1384.556354] env[62277]: DEBUG nova.compute.manager [req-fdf2f02f-2ba2-4b6f-98d0-05dcc5d0e25b req-fe8c93ae-a115-4d55-aa6e-87a390f35045 service nova] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Refreshing instance network info cache due to event network-changed-22e16106-2e17-48e1-a3df-13744f9ed1e0. {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11104}} [ 1384.556562] env[62277]: DEBUG oslo_concurrency.lockutils [req-fdf2f02f-2ba2-4b6f-98d0-05dcc5d0e25b req-fe8c93ae-a115-4d55-aa6e-87a390f35045 service nova] Acquiring lock "refresh_cache-32bed248-06d5-47a1-b281-47921d99dbf6" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1384.556701] env[62277]: DEBUG oslo_concurrency.lockutils [req-fdf2f02f-2ba2-4b6f-98d0-05dcc5d0e25b req-fe8c93ae-a115-4d55-aa6e-87a390f35045 service nova] Acquired lock "refresh_cache-32bed248-06d5-47a1-b281-47921d99dbf6" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1384.556856] env[62277]: DEBUG nova.network.neutron [req-fdf2f02f-2ba2-4b6f-98d0-05dcc5d0e25b req-fe8c93ae-a115-4d55-aa6e-87a390f35045 service nova] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Refreshing network info cache for port 22e16106-2e17-48e1-a3df-13744f9ed1e0 {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1384.877190] env[62277]: DEBUG nova.network.neutron [req-fdf2f02f-2ba2-4b6f-98d0-05dcc5d0e25b req-fe8c93ae-a115-4d55-aa6e-87a390f35045 service nova] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Updated VIF entry in instance network info cache for port 22e16106-2e17-48e1-a3df-13744f9ed1e0. 
{{(pid=62277) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1384.877598] env[62277]: DEBUG nova.network.neutron [req-fdf2f02f-2ba2-4b6f-98d0-05dcc5d0e25b req-fe8c93ae-a115-4d55-aa6e-87a390f35045 service nova] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Updating instance_info_cache with network_info: [{"id": "22e16106-2e17-48e1-a3df-13744f9ed1e0", "address": "fa:16:3e:2c:65:88", "network": {"id": "22a3312e-af99-4873-b83e-83852193ff8b", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-2014430520-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3562addabe29452fbe00a6d53b316340", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7bcd9d2d-25c8-41ad-9a4a-93b9029ba993", "external-id": "nsx-vlan-transportzone-734", "segmentation_id": 734, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap22e16106-2e", "ovs_interfaceid": "22e16106-2e17-48e1-a3df-13744f9ed1e0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1384.886814] env[62277]: DEBUG oslo_concurrency.lockutils [req-fdf2f02f-2ba2-4b6f-98d0-05dcc5d0e25b req-fe8c93ae-a115-4d55-aa6e-87a390f35045 service nova] Releasing lock "refresh_cache-32bed248-06d5-47a1-b281-47921d99dbf6" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1386.835830] env[62277]: DEBUG oslo_concurrency.lockutils [None req-ce806ea2-0ed1-4e6b-84a6-51b6227c5ad3 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Acquiring lock "6d759045-e1fc-43ea-a882-1ead769b6d29" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1400.170632] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1400.170632] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Starting heal instance info cache {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9909}} [ 1400.170632] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Rebuilding the list of instances to heal {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9913}} [ 1400.194989] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Skipping network cache update for instance because it is Building. 
{{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1400.195197] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1400.195344] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1400.195486] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1400.195569] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1400.195689] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1400.195809] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1400.195927] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1400.196073] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1400.196199] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1400.196345] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Didn't find any instances for network info cache update. 
{{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9995}} [ 1400.196822] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1400.196994] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1401.168361] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1402.164248] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1402.167806] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1402.167999] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager.update_available_resource {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1402.179031] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1402.179134] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1402.179288] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1402.179439] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62277) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1402.180570] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf7d7c99-2bf4-4e44-abac-209c222ddc74 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1402.189475] env[62277]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f8c2978-f394-49fc-ad05-b10a02048ad6 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1402.203510] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0991cfb2-2d85-44f7-b9d2-7558b77f5da7 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1402.209921] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b1f2cfa-ad94-4b80-a8ff-d9a5fa2d060b {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1402.238505] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181450MB free_disk=184GB free_vcpus=48 pci_devices=None {{(pid=62277) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1402.238663] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1402.238850] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1402.332295] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 866c4415-caab-4d81-86ba-ed662feb3c4f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1402.332295] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 350e2302-66b9-4dd6-b0f4-77000992408b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1402.332295] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 23bc5a48-3e96-4897-bf28-ad14a0bdde62 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1402.332295] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 346748bd-b4e8-4e93-b71d-66c90a45e372 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1402.332295] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 930ff058-ab48-4c8a-8f5e-4820a3b12d50 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1402.332295] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1402.332295] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 1e8429b2-7149-4832-8590-e0ebd8501176 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1402.332295] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 21bd4623-2b46-43c4-859f-c4d3bf261e1f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1402.332295] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 6d759045-e1fc-43ea-a882-1ead769b6d29 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1402.332295] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 32bed248-06d5-47a1-b281-47921d99dbf6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1402.343661] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance e573e784-3318-4a41-89fd-40cbe8749413 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1402.356614] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 06571cd1-61a1-48e9-a204-624d5f383ad3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1402.367165] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 8d00162c-7379-48b6-841b-f802db2582db has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1402.378757] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance c8d02374-bed2-4b4a-9bab-3a3dec87ad3e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1402.388537] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance df611bf9-45db-4940-a59e-fccc7d96b935 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1402.398555] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 5d595f5e-6d35-4c89-a4e2-a3639c6145c8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1402.408271] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 4fd54f91-dedd-4ce2-8acf-8a2123be73b8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1402.417767] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance baabe4ee-b366-45a8-bf06-cd63f697e7dc has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1402.428463] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 5cf06245-3fa1-4596-8260-7a82bc4a1193 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1402.437883] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 308e5c48-c452-4dbf-94b0-1eb12951e620 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1402.448828] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 39172747-1245-473d-9f18-87bae208b5b1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1402.460421] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 7ecfbbee-4955-4704-af62-ce8f5470cfbe has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1402.471317] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 4ca037fc-9a4e-413b-9b4e-2122f5a4fe18 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1402.481655] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 900160c8-a715-45a4-8709-b314fc3216d5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1402.497596] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 0901d48b-bc88-461b-8503-eb6b51c39148 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1402.506970] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 63267d5c-d004-41c1-866a-75b9e37521b7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1402.508180] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1402.508180] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1402.804993] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-809653b7-f567-49b1-83c1-27b9aaa6467a {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1402.812665] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-34b38af1-7278-4c44-a78a-129eae832d95 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1402.841342] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-14bc79b4-1d4d-445e-8b22-7492fe9457d7 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1402.847962] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-587290e7-dfbf-43e9-a841-1a0acbd2e13e {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1402.860614] env[62277]: DEBUG nova.compute.provider_tree [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1402.869312] env[62277]: DEBUG nova.scheduler.client.report [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1402.882563] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62277) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1402.882740] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.644s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1403.883424] env[62277]: DEBUG oslo_service.periodic_task [None 
req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1406.175756] env[62277]: DEBUG oslo_concurrency.lockutils [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Acquiring lock "13959890-87a1-45ba-98de-621373e265e7" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1406.176350] env[62277]: DEBUG oslo_concurrency.lockutils [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Lock "13959890-87a1-45ba-98de-621373e265e7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1407.168594] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1407.168773] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=62277) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10528}} [ 1412.021304] env[62277]: DEBUG oslo_concurrency.lockutils [None req-4d5d312e-2a71-43df-bdb5-922651d130c8 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Acquiring lock "b2ad5654-28f5-43b2-acb8-a7eb01f70b55" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1412.021304] env[62277]: DEBUG oslo_concurrency.lockutils [None req-4d5d312e-2a71-43df-bdb5-922651d130c8 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Lock "b2ad5654-28f5-43b2-acb8-a7eb01f70b55" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1419.437939] env[62277]: DEBUG oslo_concurrency.lockutils [None req-06fd2190-8827-44cb-9e1d-adc9bd43f0c5 tempest-VolumesAdminNegativeTest-1493025573 tempest-VolumesAdminNegativeTest-1493025573-project-member] Acquiring lock "ad6f7885-4dbd-4306-baf2-b75cc43276d3" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1419.438644] env[62277]: DEBUG oslo_concurrency.lockutils [None req-06fd2190-8827-44cb-9e1d-adc9bd43f0c5 tempest-VolumesAdminNegativeTest-1493025573 tempest-VolumesAdminNegativeTest-1493025573-project-member] Lock "ad6f7885-4dbd-4306-baf2-b75cc43276d3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.005s {{(pid=62277) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1419.774530] env[62277]: DEBUG oslo_concurrency.lockutils [None req-88dfee0e-d576-41c1-b681-1766f46949a4 tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Acquiring lock "32bed248-06d5-47a1-b281-47921d99dbf6" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1422.704149] env[62277]: DEBUG oslo_concurrency.lockutils [None req-658d427f-3d3b-4040-a5fe-541cd7995905 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Acquiring lock "7a91e375-b4c6-4e05-a631-dae60926de1a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1422.704464] env[62277]: DEBUG oslo_concurrency.lockutils [None req-658d427f-3d3b-4040-a5fe-541cd7995905 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Lock "7a91e375-b4c6-4e05-a631-dae60926de1a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1423.575029] env[62277]: DEBUG oslo_concurrency.lockutils [None req-bb84464e-5140-4875-9c67-d5873702ff38 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Acquiring lock "26dc82d6-bc3c-4b53-8fff-64578be0d404" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1423.576096] env[62277]: DEBUG oslo_concurrency.lockutils [None req-bb84464e-5140-4875-9c67-d5873702ff38 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Lock "26dc82d6-bc3c-4b53-8fff-64578be0d404" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1428.398940] env[62277]: WARNING oslo_vmware.rw_handles [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1428.398940] env[62277]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1428.398940] env[62277]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1428.398940] env[62277]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1428.398940] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1428.398940] env[62277]: ERROR oslo_vmware.rw_handles response.begin() [ 1428.398940] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1428.398940] env[62277]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1428.398940] env[62277]: ERROR oslo_vmware.rw_handles File 
"/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1428.398940] env[62277]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1428.398940] env[62277]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1428.398940] env[62277]: ERROR oslo_vmware.rw_handles [ 1428.399606] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Downloaded image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to vmware_temp/00af3c49-1285-4aa2-b297-51bf8b2f0020/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1428.401513] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Caching image {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1428.401780] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Copying Virtual Disk [datastore2] vmware_temp/00af3c49-1285-4aa2-b297-51bf8b2f0020/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk to [datastore2] vmware_temp/00af3c49-1285-4aa2-b297-51bf8b2f0020/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk {{(pid=62277) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1428.402103] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-4176eb66-6864-4b82-9697-1b582db033b9 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1428.410420] env[62277]: DEBUG oslo_vmware.api [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Waiting for the task: (returnval){ [ 1428.410420] env[62277]: value = "task-1405394" [ 1428.410420] env[62277]: _type = "Task" [ 1428.410420] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1428.420711] env[62277]: DEBUG oslo_vmware.api [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Task: {'id': task-1405394, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1428.921317] env[62277]: DEBUG oslo_vmware.exceptions [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Fault InvalidArgument not matched. 
{{(pid=62277) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1428.921764] env[62277]: DEBUG oslo_concurrency.lockutils [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1428.922553] env[62277]: ERROR nova.compute.manager [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1428.922553] env[62277]: Faults: ['InvalidArgument'] [ 1428.922553] env[62277]: ERROR nova.compute.manager [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Traceback (most recent call last): [ 1428.922553] env[62277]: ERROR nova.compute.manager [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1428.922553] env[62277]: ERROR nova.compute.manager [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] yield resources [ 1428.922553] env[62277]: ERROR nova.compute.manager [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1428.922553] env[62277]: ERROR nova.compute.manager [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] self.driver.spawn(context, instance, image_meta, [ 1428.922553] env[62277]: ERROR nova.compute.manager [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1428.922553] env[62277]: ERROR nova.compute.manager [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1428.922553] env[62277]: ERROR nova.compute.manager [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1428.922553] env[62277]: ERROR nova.compute.manager [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] self._fetch_image_if_missing(context, vi) [ 1428.922553] env[62277]: ERROR nova.compute.manager [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1428.922553] env[62277]: ERROR nova.compute.manager [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] image_cache(vi, tmp_image_ds_loc) [ 1428.922553] env[62277]: ERROR nova.compute.manager [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1428.922553] env[62277]: ERROR nova.compute.manager [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] vm_util.copy_virtual_disk( [ 1428.922553] env[62277]: ERROR nova.compute.manager [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1428.922553] env[62277]: ERROR nova.compute.manager [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] session._wait_for_task(vmdk_copy_task) [ 1428.922553] env[62277]: ERROR nova.compute.manager [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1428.922553] env[62277]: ERROR nova.compute.manager [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] return self.wait_for_task(task_ref) [ 1428.922553] env[62277]: ERROR nova.compute.manager [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1428.922553] env[62277]: ERROR nova.compute.manager [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] return evt.wait() [ 1428.922553] env[62277]: ERROR nova.compute.manager [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1428.922553] env[62277]: ERROR nova.compute.manager [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] result = hub.switch() [ 1428.922553] env[62277]: ERROR nova.compute.manager [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1428.922553] env[62277]: ERROR nova.compute.manager [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] return self.greenlet.switch() [ 1428.922553] env[62277]: ERROR nova.compute.manager [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1428.922553] env[62277]: ERROR nova.compute.manager [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] self.f(*self.args, **self.kw) [ 1428.922553] env[62277]: ERROR nova.compute.manager [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1428.922553] env[62277]: ERROR nova.compute.manager [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] raise exceptions.translate_fault(task_info.error) [ 1428.922553] env[62277]: ERROR nova.compute.manager [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1428.922553] env[62277]: ERROR nova.compute.manager [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Faults: ['InvalidArgument'] [ 1428.922553] env[62277]: ERROR nova.compute.manager [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] [ 1428.924088] env[62277]: INFO nova.compute.manager [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Terminating instance [ 1428.925726] env[62277]: DEBUG oslo_concurrency.lockutils [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1428.925726] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1428.926438] env[62277]: DEBUG nova.compute.manager [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 
tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Start destroying the instance on the hypervisor. {{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1428.926582] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1428.926788] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2e722576-b9b4-4494-8c9b-3d4ff4f5804e {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1428.929878] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d136ca24-5197-4032-92c8-49594089f884 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1428.937355] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Unregistering the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1428.937611] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-f551efd7-8130-4e6b-9835-98b1190697db {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1428.940013] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1428.940193] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62277) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1428.941174] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7b565d04-19d9-425d-b825-1b5746cab68f {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1428.946877] env[62277]: DEBUG oslo_vmware.api [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Waiting for the task: (returnval){ [ 1428.946877] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]520ef7c6-e7e9-f58d-0f86-fdcf61940b3b" [ 1428.946877] env[62277]: _type = "Task" [ 1428.946877] env[62277]: } to complete. 
{{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1428.958785] env[62277]: DEBUG oslo_vmware.api [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]520ef7c6-e7e9-f58d-0f86-fdcf61940b3b, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1429.015243] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Unregistered the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1429.015243] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Deleting contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1429.015243] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Deleting the datastore file [datastore2] 866c4415-caab-4d81-86ba-ed662feb3c4f {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1429.015243] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-caec69c7-f43b-4979-b06d-7102ffad22d7 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1429.022573] env[62277]: DEBUG oslo_vmware.api [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Waiting for the task: (returnval){ [ 1429.022573] env[62277]: value = "task-1405396" [ 1429.022573] env[62277]: _type = "Task" [ 1429.022573] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1429.033024] env[62277]: DEBUG oslo_vmware.api [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Task: {'id': task-1405396, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1429.228612] env[62277]: DEBUG oslo_concurrency.lockutils [None req-1cccbcb6-e42b-448d-8c4b-38ff0573d14d tempest-ListServersNegativeTestJSON-1271685526 tempest-ListServersNegativeTestJSON-1271685526-project-member] Acquiring lock "7e8d3614-e021-4509-87c7-1c4d68c3e570" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1429.229100] env[62277]: DEBUG oslo_concurrency.lockutils [None req-1cccbcb6-e42b-448d-8c4b-38ff0573d14d tempest-ListServersNegativeTestJSON-1271685526 tempest-ListServersNegativeTestJSON-1271685526-project-member] Lock "7e8d3614-e021-4509-87c7-1c4d68c3e570" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1429.260827] env[62277]: DEBUG oslo_concurrency.lockutils [None req-1cccbcb6-e42b-448d-8c4b-38ff0573d14d tempest-ListServersNegativeTestJSON-1271685526 tempest-ListServersNegativeTestJSON-1271685526-project-member] Acquiring lock "c3af2c56-c745-43e4-9f1b-77937a1d2559" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1429.261072] env[62277]: DEBUG oslo_concurrency.lockutils [None req-1cccbcb6-e42b-448d-8c4b-38ff0573d14d tempest-ListServersNegativeTestJSON-1271685526 tempest-ListServersNegativeTestJSON-1271685526-project-member] Lock "c3af2c56-c745-43e4-9f1b-77937a1d2559" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1429.296477] env[62277]: DEBUG oslo_concurrency.lockutils [None req-1cccbcb6-e42b-448d-8c4b-38ff0573d14d tempest-ListServersNegativeTestJSON-1271685526 tempest-ListServersNegativeTestJSON-1271685526-project-member] Acquiring lock "b8467237-7a59-4302-9dd3-0cbdfc813753" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1429.296780] env[62277]: DEBUG oslo_concurrency.lockutils [None req-1cccbcb6-e42b-448d-8c4b-38ff0573d14d tempest-ListServersNegativeTestJSON-1271685526 tempest-ListServersNegativeTestJSON-1271685526-project-member] Lock "b8467237-7a59-4302-9dd3-0cbdfc813753" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1429.457779] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Preparing fetch location {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1429.458080] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 
tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Creating directory with path [datastore2] vmware_temp/c20b091c-8db7-4931-9a4b-b24368bb1f79/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1429.458341] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2d895566-9b3f-40b4-bbe1-99ce46d24077 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1429.471448] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Created directory with path [datastore2] vmware_temp/c20b091c-8db7-4931-9a4b-b24368bb1f79/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1429.471712] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Fetch image to [datastore2] vmware_temp/c20b091c-8db7-4931-9a4b-b24368bb1f79/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1429.471987] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to [datastore2] vmware_temp/c20b091c-8db7-4931-9a4b-b24368bb1f79/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1429.472795] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-298d8ae2-c2f6-4aa7-85fb-0794ad06b00a {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1429.480431] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a848dc1-259d-4550-8d25-1e8c36481e54 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1429.489850] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-13400299-6fcb-4d1a-ac17-cd50ca11bc6b {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1429.529410] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-65ecd407-3799-49fe-9b56-7c23c2759c57 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1429.536825] env[62277]: DEBUG oslo_vmware.api [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Task: {'id': task-1405396, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.079238} completed successfully. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1429.538391] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Deleted the datastore file {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1429.538629] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Deleted contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1429.538851] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1429.539086] env[62277]: INFO nova.compute.manager [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Took 0.61 seconds to destroy the instance on the hypervisor. [ 1429.541166] env[62277]: DEBUG nova.compute.claims [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Aborting claim: {{(pid=62277) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1429.541701] env[62277]: DEBUG oslo_concurrency.lockutils [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1429.541986] env[62277]: DEBUG oslo_concurrency.lockutils [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.001s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1429.544549] env[62277]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-b11d10dd-3248-4b35-9903-ef09b7959e1c {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1429.567431] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1429.640274] env[62277]: DEBUG oslo_vmware.rw_handles [None 
req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c20b091c-8db7-4931-9a4b-b24368bb1f79/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62277) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1429.711866] env[62277]: DEBUG oslo_vmware.rw_handles [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Completed reading data from the image iterator. {{(pid=62277) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1429.711866] env[62277]: DEBUG oslo_vmware.rw_handles [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c20b091c-8db7-4931-9a4b-b24368bb1f79/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62277) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1430.185133] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db8e7690-af44-4884-9d4d-6f07b5242427 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1430.193197] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6d59d54c-ed8e-4fe0-8c38-9659d359f381 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1430.224826] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8538f982-c699-4a42-9ea7-37a808628d73 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1430.233939] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1cab7e9c-28df-446c-9634-996735dff52e {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1430.251603] env[62277]: DEBUG nova.compute.provider_tree [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1430.269733] env[62277]: DEBUG nova.scheduler.client.report [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 
'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1430.288695] env[62277]: DEBUG oslo_concurrency.lockutils [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.747s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1430.289277] env[62277]: ERROR nova.compute.manager [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1430.289277] env[62277]: Faults: ['InvalidArgument'] [ 1430.289277] env[62277]: ERROR nova.compute.manager [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Traceback (most recent call last): [ 1430.289277] env[62277]: ERROR nova.compute.manager [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1430.289277] env[62277]: ERROR nova.compute.manager [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] self.driver.spawn(context, instance, image_meta, [ 1430.289277] env[62277]: ERROR nova.compute.manager [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1430.289277] env[62277]: ERROR nova.compute.manager [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1430.289277] env[62277]: ERROR nova.compute.manager [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1430.289277] env[62277]: ERROR nova.compute.manager [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] self._fetch_image_if_missing(context, vi) [ 1430.289277] env[62277]: ERROR nova.compute.manager [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1430.289277] env[62277]: ERROR nova.compute.manager [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] image_cache(vi, tmp_image_ds_loc) [ 1430.289277] env[62277]: ERROR nova.compute.manager [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1430.289277] env[62277]: ERROR nova.compute.manager [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] vm_util.copy_virtual_disk( [ 1430.289277] env[62277]: ERROR nova.compute.manager [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1430.289277] env[62277]: ERROR nova.compute.manager [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] session._wait_for_task(vmdk_copy_task) [ 1430.289277] env[62277]: ERROR nova.compute.manager [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1430.289277] env[62277]: ERROR nova.compute.manager [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] return self.wait_for_task(task_ref) [ 1430.289277] env[62277]: ERROR 
nova.compute.manager [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1430.289277] env[62277]: ERROR nova.compute.manager [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] return evt.wait() [ 1430.289277] env[62277]: ERROR nova.compute.manager [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1430.289277] env[62277]: ERROR nova.compute.manager [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] result = hub.switch() [ 1430.289277] env[62277]: ERROR nova.compute.manager [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1430.289277] env[62277]: ERROR nova.compute.manager [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] return self.greenlet.switch() [ 1430.289277] env[62277]: ERROR nova.compute.manager [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1430.289277] env[62277]: ERROR nova.compute.manager [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] self.f(*self.args, **self.kw) [ 1430.289277] env[62277]: ERROR nova.compute.manager [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1430.289277] env[62277]: ERROR nova.compute.manager [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] raise exceptions.translate_fault(task_info.error) [ 1430.289277] env[62277]: ERROR nova.compute.manager [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1430.289277] env[62277]: ERROR nova.compute.manager [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Faults: ['InvalidArgument'] [ 1430.289277] env[62277]: ERROR nova.compute.manager [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] [ 1430.290013] env[62277]: DEBUG nova.compute.utils [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] VimFaultException {{(pid=62277) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1430.293111] env[62277]: DEBUG nova.compute.manager [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Build of instance 866c4415-caab-4d81-86ba-ed662feb3c4f was re-scheduled: A specified parameter was not correct: fileType [ 1430.293111] env[62277]: Faults: ['InvalidArgument'] {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1430.293111] env[62277]: DEBUG nova.compute.manager [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Unplugging VIFs for instance {{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1430.293111] env[62277]: DEBUG nova.compute.manager [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 
tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1430.293566] env[62277]: DEBUG nova.compute.manager [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1430.293765] env[62277]: DEBUG nova.network.neutron [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1430.794773] env[62277]: DEBUG nova.network.neutron [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1430.816478] env[62277]: INFO nova.compute.manager [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Took 0.52 seconds to deallocate network for instance. [ 1430.969941] env[62277]: INFO nova.scheduler.client.report [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Deleted allocations for instance 866c4415-caab-4d81-86ba-ed662feb3c4f [ 1430.992390] env[62277]: DEBUG oslo_concurrency.lockutils [None req-5b69444e-1797-4ff9-b5f9-a82c69b3a351 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Lock "866c4415-caab-4d81-86ba-ed662feb3c4f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 421.266s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1430.996392] env[62277]: DEBUG oslo_concurrency.lockutils [None req-94ebb12a-e413-4ba8-b304-6c4ff8e832a5 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Lock "866c4415-caab-4d81-86ba-ed662feb3c4f" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 222.518s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1430.996392] env[62277]: DEBUG oslo_concurrency.lockutils [None req-94ebb12a-e413-4ba8-b304-6c4ff8e832a5 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Acquiring lock "866c4415-caab-4d81-86ba-ed662feb3c4f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1430.996392] env[62277]: DEBUG oslo_concurrency.lockutils [None 
req-94ebb12a-e413-4ba8-b304-6c4ff8e832a5 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Lock "866c4415-caab-4d81-86ba-ed662feb3c4f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1430.996392] env[62277]: DEBUG oslo_concurrency.lockutils [None req-94ebb12a-e413-4ba8-b304-6c4ff8e832a5 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Lock "866c4415-caab-4d81-86ba-ed662feb3c4f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1431.002790] env[62277]: INFO nova.compute.manager [None req-94ebb12a-e413-4ba8-b304-6c4ff8e832a5 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Terminating instance [ 1431.005651] env[62277]: DEBUG nova.compute.manager [None req-94ebb12a-e413-4ba8-b304-6c4ff8e832a5 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Start destroying the instance on the hypervisor. {{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1431.005741] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-94ebb12a-e413-4ba8-b304-6c4ff8e832a5 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1431.008106] env[62277]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-02bf55ed-1c96-490f-ad37-9d1d0d2ce9a4 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1431.009558] env[62277]: DEBUG nova.compute.manager [None req-5ffc1411-5f8e-44bd-8435-e2c993f18ac8 tempest-MultipleCreateTestJSON-1044264016 tempest-MultipleCreateTestJSON-1044264016-project-member] [instance: e573e784-3318-4a41-89fd-40cbe8749413] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1431.019155] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-63dabe78-3973-427e-a3dc-fda5250d8a26 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1431.052327] env[62277]: WARNING nova.virt.vmwareapi.vmops [None req-94ebb12a-e413-4ba8-b304-6c4ff8e832a5 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 866c4415-caab-4d81-86ba-ed662feb3c4f could not be found. 
[ 1431.052941] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-94ebb12a-e413-4ba8-b304-6c4ff8e832a5 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1431.052941] env[62277]: INFO nova.compute.manager [None req-94ebb12a-e413-4ba8-b304-6c4ff8e832a5 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1431.052941] env[62277]: DEBUG oslo.service.loopingcall [None req-94ebb12a-e413-4ba8-b304-6c4ff8e832a5 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1431.054153] env[62277]: DEBUG nova.compute.manager [None req-5ffc1411-5f8e-44bd-8435-e2c993f18ac8 tempest-MultipleCreateTestJSON-1044264016 tempest-MultipleCreateTestJSON-1044264016-project-member] [instance: e573e784-3318-4a41-89fd-40cbe8749413] Instance disappeared before build. {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1431.054260] env[62277]: DEBUG nova.compute.manager [-] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1431.054366] env[62277]: DEBUG nova.network.neutron [-] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1431.078248] env[62277]: DEBUG oslo_concurrency.lockutils [None req-5ffc1411-5f8e-44bd-8435-e2c993f18ac8 tempest-MultipleCreateTestJSON-1044264016 tempest-MultipleCreateTestJSON-1044264016-project-member] Lock "e573e784-3318-4a41-89fd-40cbe8749413" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 203.186s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1431.093018] env[62277]: DEBUG nova.network.neutron [-] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1431.093018] env[62277]: DEBUG nova.compute.manager [None req-5ffc1411-5f8e-44bd-8435-e2c993f18ac8 tempest-MultipleCreateTestJSON-1044264016 tempest-MultipleCreateTestJSON-1044264016-project-member] [instance: 06571cd1-61a1-48e9-a204-624d5f383ad3] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1431.100824] env[62277]: INFO nova.compute.manager [-] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] Took 0.05 seconds to deallocate network for instance. [ 1431.133687] env[62277]: DEBUG nova.compute.manager [None req-5ffc1411-5f8e-44bd-8435-e2c993f18ac8 tempest-MultipleCreateTestJSON-1044264016 tempest-MultipleCreateTestJSON-1044264016-project-member] [instance: 06571cd1-61a1-48e9-a204-624d5f383ad3] Instance disappeared before build. 
{{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1431.162867] env[62277]: DEBUG oslo_concurrency.lockutils [None req-5ffc1411-5f8e-44bd-8435-e2c993f18ac8 tempest-MultipleCreateTestJSON-1044264016 tempest-MultipleCreateTestJSON-1044264016-project-member] Lock "06571cd1-61a1-48e9-a204-624d5f383ad3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 203.243s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1431.176649] env[62277]: DEBUG nova.compute.manager [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] [instance: 8d00162c-7379-48b6-841b-f802db2582db] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1431.242841] env[62277]: DEBUG oslo_concurrency.lockutils [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1431.243091] env[62277]: DEBUG oslo_concurrency.lockutils [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1431.244574] env[62277]: INFO nova.compute.claims [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] [instance: 8d00162c-7379-48b6-841b-f802db2582db] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1431.253494] env[62277]: DEBUG oslo_concurrency.lockutils [None req-94ebb12a-e413-4ba8-b304-6c4ff8e832a5 tempest-ServerDiagnosticsNegativeTest-2142209624 tempest-ServerDiagnosticsNegativeTest-2142209624-project-member] Lock "866c4415-caab-4d81-86ba-ed662feb3c4f" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.260s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1431.254321] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "866c4415-caab-4d81-86ba-ed662feb3c4f" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 183.117s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1431.254499] env[62277]: INFO nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 866c4415-caab-4d81-86ba-ed662feb3c4f] During sync_power_state the instance has a pending task (deleting). Skip. 
[ 1431.254662] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "866c4415-caab-4d81-86ba-ed662feb3c4f" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1431.801597] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3a00ba7f-b3e2-4101-8b10-ddeaa2a5a825 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1431.809588] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-246d2066-5fe7-4d92-8c8f-f3c9028d077d {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1431.853078] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-54ad48bb-0099-4d3c-bdc6-306123697487 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1431.864931] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0681d371-8462-4c91-9abc-3d463c5f88d4 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1431.887189] env[62277]: DEBUG nova.compute.provider_tree [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1431.896974] env[62277]: DEBUG nova.scheduler.client.report [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1431.916995] env[62277]: DEBUG oslo_concurrency.lockutils [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.674s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1431.917776] env[62277]: DEBUG nova.compute.manager [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] [instance: 8d00162c-7379-48b6-841b-f802db2582db] Start building networks asynchronously for instance. 
{{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1431.958463] env[62277]: DEBUG nova.compute.utils [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Using /dev/sd instead of None {{(pid=62277) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1431.959721] env[62277]: DEBUG nova.compute.manager [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] [instance: 8d00162c-7379-48b6-841b-f802db2582db] Allocating IP information in the background. {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1431.959893] env[62277]: DEBUG nova.network.neutron [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] [instance: 8d00162c-7379-48b6-841b-f802db2582db] allocate_for_instance() {{(pid=62277) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1431.971114] env[62277]: DEBUG nova.compute.manager [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] [instance: 8d00162c-7379-48b6-841b-f802db2582db] Start building block device mappings for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1432.033507] env[62277]: DEBUG nova.policy [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7d934ed57e6a430c9afccf259f146ff2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8be0173ecd624df0b76957f641d96c9d', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62277) authorize /opt/stack/nova/nova/policy.py:203}} [ 1432.045463] env[62277]: DEBUG nova.compute.manager [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] [instance: 8d00162c-7379-48b6-841b-f802db2582db] Start spawning the instance on the hypervisor. 
{{(pid=62277) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1432.077259] env[62277]: DEBUG nova.virt.hardware [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-11-06T22:11:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-11-06T22:11:15Z,direct_url=,disk_format='vmdk',id=6f125163-af69-40e9-92ae-3b8a01d74b60,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='63ed6630e9c140baa826f53d7a0564d1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-11-06T22:11:16Z,virtual_size=,visibility=), allow threads: False {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1432.077259] env[62277]: DEBUG nova.virt.hardware [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Flavor limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1432.077259] env[62277]: DEBUG nova.virt.hardware [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Image limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1432.077259] env[62277]: DEBUG nova.virt.hardware [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Flavor pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1432.077259] env[62277]: DEBUG nova.virt.hardware [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Image pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1432.077259] env[62277]: DEBUG nova.virt.hardware [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1432.077259] env[62277]: DEBUG nova.virt.hardware [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1432.077259] env[62277]: DEBUG nova.virt.hardware [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1432.077259] env[62277]: DEBUG nova.virt.hardware [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc 
tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Got 1 possible topologies {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1432.077678] env[62277]: DEBUG nova.virt.hardware [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1432.078063] env[62277]: DEBUG nova.virt.hardware [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1432.079464] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78789d01-47a8-4dc0-9fe6-4c757e780a28 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1432.088612] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-187015ce-09b7-44bf-8e76-bbac8a07fd4e {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1432.421867] env[62277]: DEBUG nova.network.neutron [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] [instance: 8d00162c-7379-48b6-841b-f802db2582db] Successfully created port: a9b1ba4c-34a0-443e-9571-64363d20760c {{(pid=62277) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1433.386987] env[62277]: DEBUG nova.compute.manager [req-75ee3410-da84-4ccb-917d-e1b035a2ca6f req-c062dcb7-7f44-4dee-9508-e13ceb5d7d33 service nova] [instance: 8d00162c-7379-48b6-841b-f802db2582db] Received event network-vif-plugged-a9b1ba4c-34a0-443e-9571-64363d20760c {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 1433.387252] env[62277]: DEBUG oslo_concurrency.lockutils [req-75ee3410-da84-4ccb-917d-e1b035a2ca6f req-c062dcb7-7f44-4dee-9508-e13ceb5d7d33 service nova] Acquiring lock "8d00162c-7379-48b6-841b-f802db2582db-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1433.387525] env[62277]: DEBUG oslo_concurrency.lockutils [req-75ee3410-da84-4ccb-917d-e1b035a2ca6f req-c062dcb7-7f44-4dee-9508-e13ceb5d7d33 service nova] Lock "8d00162c-7379-48b6-841b-f802db2582db-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1433.387731] env[62277]: DEBUG oslo_concurrency.lockutils [req-75ee3410-da84-4ccb-917d-e1b035a2ca6f req-c062dcb7-7f44-4dee-9508-e13ceb5d7d33 service nova] Lock "8d00162c-7379-48b6-841b-f802db2582db-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1433.387903] env[62277]: DEBUG nova.compute.manager [req-75ee3410-da84-4ccb-917d-e1b035a2ca6f req-c062dcb7-7f44-4dee-9508-e13ceb5d7d33 service 
nova] [instance: 8d00162c-7379-48b6-841b-f802db2582db] No waiting events found dispatching network-vif-plugged-a9b1ba4c-34a0-443e-9571-64363d20760c {{(pid=62277) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1433.388195] env[62277]: WARNING nova.compute.manager [req-75ee3410-da84-4ccb-917d-e1b035a2ca6f req-c062dcb7-7f44-4dee-9508-e13ceb5d7d33 service nova] [instance: 8d00162c-7379-48b6-841b-f802db2582db] Received unexpected event network-vif-plugged-a9b1ba4c-34a0-443e-9571-64363d20760c for instance with vm_state building and task_state spawning. [ 1433.489678] env[62277]: DEBUG nova.network.neutron [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] [instance: 8d00162c-7379-48b6-841b-f802db2582db] Successfully updated port: a9b1ba4c-34a0-443e-9571-64363d20760c {{(pid=62277) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1433.500784] env[62277]: DEBUG oslo_concurrency.lockutils [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Acquiring lock "refresh_cache-8d00162c-7379-48b6-841b-f802db2582db" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1433.500951] env[62277]: DEBUG oslo_concurrency.lockutils [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Acquired lock "refresh_cache-8d00162c-7379-48b6-841b-f802db2582db" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1433.501114] env[62277]: DEBUG nova.network.neutron [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] [instance: 8d00162c-7379-48b6-841b-f802db2582db] Building network info cache for instance {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1433.554145] env[62277]: DEBUG nova.network.neutron [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] [instance: 8d00162c-7379-48b6-841b-f802db2582db] Instance cache missing network info. 
{{(pid=62277) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1433.727420] env[62277]: DEBUG nova.network.neutron [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] [instance: 8d00162c-7379-48b6-841b-f802db2582db] Updating instance_info_cache with network_info: [{"id": "a9b1ba4c-34a0-443e-9571-64363d20760c", "address": "fa:16:3e:46:33:59", "network": {"id": "a2fe7804-cb8a-4be5-ab52-2389c01a1954", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-496298938-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8be0173ecd624df0b76957f641d96c9d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "359c2c31-99c4-41d7-a513-3bc4825897a0", "external-id": "nsx-vlan-transportzone-173", "segmentation_id": 173, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa9b1ba4c-34", "ovs_interfaceid": "a9b1ba4c-34a0-443e-9571-64363d20760c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1433.739562] env[62277]: DEBUG oslo_concurrency.lockutils [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Releasing lock "refresh_cache-8d00162c-7379-48b6-841b-f802db2582db" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1433.740042] env[62277]: DEBUG nova.compute.manager [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] [instance: 8d00162c-7379-48b6-841b-f802db2582db] Instance network_info: |[{"id": "a9b1ba4c-34a0-443e-9571-64363d20760c", "address": "fa:16:3e:46:33:59", "network": {"id": "a2fe7804-cb8a-4be5-ab52-2389c01a1954", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-496298938-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8be0173ecd624df0b76957f641d96c9d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "359c2c31-99c4-41d7-a513-3bc4825897a0", "external-id": "nsx-vlan-transportzone-173", "segmentation_id": 173, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa9b1ba4c-34", "ovs_interfaceid": "a9b1ba4c-34a0-443e-9571-64363d20760c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 
1433.740436] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] [instance: 8d00162c-7379-48b6-841b-f802db2582db] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:46:33:59', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '359c2c31-99c4-41d7-a513-3bc4825897a0', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'a9b1ba4c-34a0-443e-9571-64363d20760c', 'vif_model': 'vmxnet3'}] {{(pid=62277) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1433.748068] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Creating folder: Project (8be0173ecd624df0b76957f641d96c9d). Parent ref: group-v297781. {{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1433.748655] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f94ea6ed-11a2-4de6-a34a-f4a657aabee4 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1433.758687] env[62277]: INFO nova.virt.vmwareapi.vm_util [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Created folder: Project (8be0173ecd624df0b76957f641d96c9d) in parent group-v297781. [ 1433.758877] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Creating folder: Instances. Parent ref: group-v297847. {{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1433.759121] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-2eedbc2a-3796-4cbf-835b-986a1315b58a {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1433.767118] env[62277]: INFO nova.virt.vmwareapi.vm_util [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Created folder: Instances in parent group-v297847. [ 1433.767338] env[62277]: DEBUG oslo.service.loopingcall [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1433.767510] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8d00162c-7379-48b6-841b-f802db2582db] Creating VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1433.767699] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-6736dccf-45db-4dc4-adef-964af72ba30c {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1433.787268] env[62277]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1433.787268] env[62277]: value = "task-1405402" [ 1433.787268] env[62277]: _type = "Task" [ 1433.787268] env[62277]: } to complete. 
{{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1433.794398] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405402, 'name': CreateVM_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1434.296396] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405402, 'name': CreateVM_Task, 'duration_secs': 0.303201} completed successfully. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1434.296567] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8d00162c-7379-48b6-841b-f802db2582db] Created VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1434.297226] env[62277]: DEBUG oslo_concurrency.lockutils [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1434.297386] env[62277]: DEBUG oslo_concurrency.lockutils [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1434.297714] env[62277]: DEBUG oslo_concurrency.lockutils [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1434.297952] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c72ad3d0-681b-4709-a46a-742b41c2b455 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1434.302017] env[62277]: DEBUG oslo_vmware.api [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Waiting for the task: (returnval){ [ 1434.302017] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52323284-7249-92ac-30bc-6c811df85481" [ 1434.302017] env[62277]: _type = "Task" [ 1434.302017] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1434.309207] env[62277]: DEBUG oslo_vmware.api [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52323284-7249-92ac-30bc-6c811df85481, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1434.813083] env[62277]: DEBUG oslo_concurrency.lockutils [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1434.813083] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] [instance: 8d00162c-7379-48b6-841b-f802db2582db] Processing image 6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1434.813083] env[62277]: DEBUG oslo_concurrency.lockutils [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1435.419996] env[62277]: DEBUG nova.compute.manager [req-25b6282f-e479-4e0f-991d-d9899ce0172c req-530704ae-f671-41f6-b918-829099325f3e service nova] [instance: 8d00162c-7379-48b6-841b-f802db2582db] Received event network-changed-a9b1ba4c-34a0-443e-9571-64363d20760c {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 1435.420200] env[62277]: DEBUG nova.compute.manager [req-25b6282f-e479-4e0f-991d-d9899ce0172c req-530704ae-f671-41f6-b918-829099325f3e service nova] [instance: 8d00162c-7379-48b6-841b-f802db2582db] Refreshing instance network info cache due to event network-changed-a9b1ba4c-34a0-443e-9571-64363d20760c. {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11104}} [ 1435.420416] env[62277]: DEBUG oslo_concurrency.lockutils [req-25b6282f-e479-4e0f-991d-d9899ce0172c req-530704ae-f671-41f6-b918-829099325f3e service nova] Acquiring lock "refresh_cache-8d00162c-7379-48b6-841b-f802db2582db" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1435.420555] env[62277]: DEBUG oslo_concurrency.lockutils [req-25b6282f-e479-4e0f-991d-d9899ce0172c req-530704ae-f671-41f6-b918-829099325f3e service nova] Acquired lock "refresh_cache-8d00162c-7379-48b6-841b-f802db2582db" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1435.420709] env[62277]: DEBUG nova.network.neutron [req-25b6282f-e479-4e0f-991d-d9899ce0172c req-530704ae-f671-41f6-b918-829099325f3e service nova] [instance: 8d00162c-7379-48b6-841b-f802db2582db] Refreshing network info cache for port a9b1ba4c-34a0-443e-9571-64363d20760c {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1435.739385] env[62277]: DEBUG nova.network.neutron [req-25b6282f-e479-4e0f-991d-d9899ce0172c req-530704ae-f671-41f6-b918-829099325f3e service nova] [instance: 8d00162c-7379-48b6-841b-f802db2582db] Updated VIF entry in instance network info cache for port a9b1ba4c-34a0-443e-9571-64363d20760c. 
{{(pid=62277) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1435.739854] env[62277]: DEBUG nova.network.neutron [req-25b6282f-e479-4e0f-991d-d9899ce0172c req-530704ae-f671-41f6-b918-829099325f3e service nova] [instance: 8d00162c-7379-48b6-841b-f802db2582db] Updating instance_info_cache with network_info: [{"id": "a9b1ba4c-34a0-443e-9571-64363d20760c", "address": "fa:16:3e:46:33:59", "network": {"id": "a2fe7804-cb8a-4be5-ab52-2389c01a1954", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-496298938-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8be0173ecd624df0b76957f641d96c9d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "359c2c31-99c4-41d7-a513-3bc4825897a0", "external-id": "nsx-vlan-transportzone-173", "segmentation_id": 173, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa9b1ba4c-34", "ovs_interfaceid": "a9b1ba4c-34a0-443e-9571-64363d20760c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1435.752148] env[62277]: DEBUG oslo_concurrency.lockutils [req-25b6282f-e479-4e0f-991d-d9899ce0172c req-530704ae-f671-41f6-b918-829099325f3e service nova] Releasing lock "refresh_cache-8d00162c-7379-48b6-841b-f802db2582db" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1436.192574] env[62277]: DEBUG oslo_concurrency.lockutils [None req-a00ef9d3-8870-42b4-a406-76583c695098 tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Acquiring lock "8d00162c-7379-48b6-841b-f802db2582db" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1442.934164] env[62277]: DEBUG oslo_concurrency.lockutils [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Acquiring lock "a7cc7e45-8567-4699-af83-624b1c7c5c64" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1442.934523] env[62277]: DEBUG oslo_concurrency.lockutils [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Lock "a7cc7e45-8567-4699-af83-624b1c7c5c64" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1451.930460] env[62277]: DEBUG oslo_concurrency.lockutils [None req-4fa6a66c-d9d1-4bdc-bcb0-922a4670790e tempest-ServerActionsV293TestJSON-1090793783 
tempest-ServerActionsV293TestJSON-1090793783-project-member] Acquiring lock "24ee6c71-7267-4fe2-8ac4-84bf1d00c024" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1451.930747] env[62277]: DEBUG oslo_concurrency.lockutils [None req-4fa6a66c-d9d1-4bdc-bcb0-922a4670790e tempest-ServerActionsV293TestJSON-1090793783 tempest-ServerActionsV293TestJSON-1090793783-project-member] Lock "24ee6c71-7267-4fe2-8ac4-84bf1d00c024" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1452.919139] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e9443422-ad1c-4641-a2d4-d11014892d36 tempest-ServerMetadataNegativeTestJSON-1764342254 tempest-ServerMetadataNegativeTestJSON-1764342254-project-member] Acquiring lock "f468d0c6-35ed-4f8d-a3dc-aea9462aa7bb" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1452.919385] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e9443422-ad1c-4641-a2d4-d11014892d36 tempest-ServerMetadataNegativeTestJSON-1764342254 tempest-ServerMetadataNegativeTestJSON-1764342254-project-member] Lock "f468d0c6-35ed-4f8d-a3dc-aea9462aa7bb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1454.237151] env[62277]: DEBUG oslo_concurrency.lockutils [None req-0fbd05d1-7e13-4efb-9287-1c5673b04c42 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Acquiring lock "8167915d-ed3a-44b7-8eff-d585e7f6ffbf" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1454.237440] env[62277]: DEBUG oslo_concurrency.lockutils [None req-0fbd05d1-7e13-4efb-9287-1c5673b04c42 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Lock "8167915d-ed3a-44b7-8eff-d585e7f6ffbf" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1456.779401] env[62277]: DEBUG oslo_concurrency.lockutils [None req-678307d2-4680-407a-b1a4-a1159637463f tempest-ServerGroupTestJSON-441783187 tempest-ServerGroupTestJSON-441783187-project-member] Acquiring lock "9fedbb74-ae57-4cb8-8496-2ff9c703b46e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1456.779737] env[62277]: DEBUG oslo_concurrency.lockutils [None req-678307d2-4680-407a-b1a4-a1159637463f tempest-ServerGroupTestJSON-441783187 tempest-ServerGroupTestJSON-441783187-project-member] Lock "9fedbb74-ae57-4cb8-8496-2ff9c703b46e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s 
{{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1460.168659] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1461.168545] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1461.168993] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Starting heal instance info cache {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9909}} [ 1461.169226] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Rebuilding the list of instances to heal {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9913}} [ 1461.195204] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1461.195368] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1461.195498] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1461.195686] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1461.195830] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1461.195953] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1461.196082] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Skipping network cache update for instance because it is Building. 
{{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1461.196199] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1461.196316] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1461.196431] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 8d00162c-7379-48b6-841b-f802db2582db] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1461.196556] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Didn't find any instances for network info cache update. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9995}} [ 1461.197134] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1462.168867] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1462.169095] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1463.168485] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1464.169821] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager.update_available_resource {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1464.179583] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1464.179803] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1464.179968] env[62277]: DEBUG oslo_concurrency.lockutils [None 
req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1464.180133] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62277) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1464.181282] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2a6506c3-08de-46d4-a5e9-387037aa86ce {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1464.190778] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5a232413-04dd-471b-913e-f9bc478d4c6e {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1464.205177] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f1723d8-45fe-4165-b0f1-dac6b387b2c4 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1464.211776] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2ef93959-cb55-4fd7-b7f7-e600109e6104 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1464.240503] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181435MB free_disk=184GB free_vcpus=48 pci_devices=None {{(pid=62277) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1464.240615] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1464.240805] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1464.313320] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 350e2302-66b9-4dd6-b0f4-77000992408b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1464.313469] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 23bc5a48-3e96-4897-bf28-ad14a0bdde62 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1464.313594] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 346748bd-b4e8-4e93-b71d-66c90a45e372 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1464.313713] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 930ff058-ab48-4c8a-8f5e-4820a3b12d50 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1464.313831] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1464.313945] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 1e8429b2-7149-4832-8590-e0ebd8501176 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1464.314077] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 21bd4623-2b46-43c4-859f-c4d3bf261e1f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1464.314192] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 6d759045-e1fc-43ea-a882-1ead769b6d29 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1464.314338] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 32bed248-06d5-47a1-b281-47921d99dbf6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1464.314408] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 8d00162c-7379-48b6-841b-f802db2582db actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1464.325169] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 308e5c48-c452-4dbf-94b0-1eb12951e620 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1464.335382] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 39172747-1245-473d-9f18-87bae208b5b1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1464.345704] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 7ecfbbee-4955-4704-af62-ce8f5470cfbe has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1464.356334] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 4ca037fc-9a4e-413b-9b4e-2122f5a4fe18 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1464.366076] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 900160c8-a715-45a4-8709-b314fc3216d5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1464.375846] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 0901d48b-bc88-461b-8503-eb6b51c39148 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1464.385149] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 63267d5c-d004-41c1-866a-75b9e37521b7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1464.395309] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 13959890-87a1-45ba-98de-621373e265e7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1464.406948] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance b2ad5654-28f5-43b2-acb8-a7eb01f70b55 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1464.418566] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance ad6f7885-4dbd-4306-baf2-b75cc43276d3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1464.430022] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 7a91e375-b4c6-4e05-a631-dae60926de1a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1464.440711] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 26dc82d6-bc3c-4b53-8fff-64578be0d404 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1464.450720] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 7e8d3614-e021-4509-87c7-1c4d68c3e570 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1464.460553] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance c3af2c56-c745-43e4-9f1b-77937a1d2559 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1464.470104] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance b8467237-7a59-4302-9dd3-0cbdfc813753 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1464.479384] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance a7cc7e45-8567-4699-af83-624b1c7c5c64 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1464.491042] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 24ee6c71-7267-4fe2-8ac4-84bf1d00c024 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1464.503054] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance f468d0c6-35ed-4f8d-a3dc-aea9462aa7bb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1464.513367] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 8167915d-ed3a-44b7-8eff-d585e7f6ffbf has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1464.525470] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 9fedbb74-ae57-4cb8-8496-2ff9c703b46e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1464.525785] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1464.525947] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1464.863221] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-510d48d4-5056-4689-8837-ed5982b0ff46 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1464.871378] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-17758691-18f0-4e16-aafd-0b8909d84a2d {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1464.902196] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-89a65574-37bb-49ec-8cf3-c1c15de4a6ec {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1464.910440] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9cc13991-cf9e-478f-8e9c-4baad496827a {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1464.924692] env[62277]: DEBUG nova.compute.provider_tree [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1464.932970] env[62277]: DEBUG nova.scheduler.client.report [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1464.947644] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62277) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1464.947644] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.707s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1465.942905] env[62277]: DEBUG oslo_service.periodic_task [None 
req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1465.964482] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1467.168613] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1467.168613] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=62277) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10528}} [ 1479.261986] env[62277]: WARNING oslo_vmware.rw_handles [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1479.261986] env[62277]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1479.261986] env[62277]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1479.261986] env[62277]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1479.261986] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1479.261986] env[62277]: ERROR oslo_vmware.rw_handles response.begin() [ 1479.261986] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1479.261986] env[62277]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1479.261986] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1479.261986] env[62277]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1479.261986] env[62277]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1479.261986] env[62277]: ERROR oslo_vmware.rw_handles [ 1479.262584] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Downloaded image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to vmware_temp/c20b091c-8db7-4931-9a4b-b24368bb1f79/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1479.266043] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Caching image {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1479.266310] 
env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Copying Virtual Disk [datastore2] vmware_temp/c20b091c-8db7-4931-9a4b-b24368bb1f79/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk to [datastore2] vmware_temp/c20b091c-8db7-4931-9a4b-b24368bb1f79/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk {{(pid=62277) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1479.266596] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-d95026ec-270c-422b-9f37-eab9debd3a8b {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1479.276377] env[62277]: DEBUG oslo_vmware.api [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Waiting for the task: (returnval){ [ 1479.276377] env[62277]: value = "task-1405410" [ 1479.276377] env[62277]: _type = "Task" [ 1479.276377] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1479.285393] env[62277]: DEBUG oslo_vmware.api [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Task: {'id': task-1405410, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1479.789550] env[62277]: DEBUG oslo_vmware.exceptions [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Fault InvalidArgument not matched. 
{{(pid=62277) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1479.790134] env[62277]: DEBUG oslo_concurrency.lockutils [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1479.790844] env[62277]: ERROR nova.compute.manager [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1479.790844] env[62277]: Faults: ['InvalidArgument'] [ 1479.790844] env[62277]: ERROR nova.compute.manager [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Traceback (most recent call last): [ 1479.790844] env[62277]: ERROR nova.compute.manager [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1479.790844] env[62277]: ERROR nova.compute.manager [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] yield resources [ 1479.790844] env[62277]: ERROR nova.compute.manager [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1479.790844] env[62277]: ERROR nova.compute.manager [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] self.driver.spawn(context, instance, image_meta, [ 1479.790844] env[62277]: ERROR nova.compute.manager [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1479.790844] env[62277]: ERROR nova.compute.manager [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1479.790844] env[62277]: ERROR nova.compute.manager [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1479.790844] env[62277]: ERROR nova.compute.manager [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] self._fetch_image_if_missing(context, vi) [ 1479.790844] env[62277]: ERROR nova.compute.manager [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1479.790844] env[62277]: ERROR nova.compute.manager [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] image_cache(vi, tmp_image_ds_loc) [ 1479.790844] env[62277]: ERROR nova.compute.manager [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1479.790844] env[62277]: ERROR nova.compute.manager [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] vm_util.copy_virtual_disk( [ 1479.790844] env[62277]: ERROR nova.compute.manager [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1479.790844] env[62277]: ERROR nova.compute.manager [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] session._wait_for_task(vmdk_copy_task) [ 1479.790844] env[62277]: ERROR nova.compute.manager [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1479.790844] env[62277]: ERROR nova.compute.manager [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] return self.wait_for_task(task_ref) [ 1479.790844] env[62277]: ERROR nova.compute.manager [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1479.790844] env[62277]: ERROR nova.compute.manager [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] return evt.wait() [ 1479.790844] env[62277]: ERROR nova.compute.manager [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1479.790844] env[62277]: ERROR nova.compute.manager [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] result = hub.switch() [ 1479.790844] env[62277]: ERROR nova.compute.manager [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1479.790844] env[62277]: ERROR nova.compute.manager [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] return self.greenlet.switch() [ 1479.790844] env[62277]: ERROR nova.compute.manager [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1479.790844] env[62277]: ERROR nova.compute.manager [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] self.f(*self.args, **self.kw) [ 1479.790844] env[62277]: ERROR nova.compute.manager [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1479.790844] env[62277]: ERROR nova.compute.manager [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] raise exceptions.translate_fault(task_info.error) [ 1479.790844] env[62277]: ERROR nova.compute.manager [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1479.790844] env[62277]: ERROR nova.compute.manager [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Faults: ['InvalidArgument'] [ 1479.790844] env[62277]: ERROR nova.compute.manager [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] [ 1479.791689] env[62277]: INFO nova.compute.manager [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Terminating instance [ 1479.793796] env[62277]: DEBUG oslo_concurrency.lockutils [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1479.793796] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1479.793796] env[62277]: DEBUG nova.compute.manager [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 
tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Start destroying the instance on the hypervisor. {{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1479.793946] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1479.794109] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5a885628-f680-4080-b9d6-b8df445a38fb {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1479.796679] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a94fb156-5a40-4c6c-a0e3-b682880e2872 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1479.804054] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Unregistering the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1479.804288] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-dca615e8-11dd-4c9b-a686-8b3e885f4ccd {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1479.808745] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1479.808919] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62277) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1479.809906] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-5bc2e67c-5b80-4675-8de5-19f7f346af96 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1479.823163] env[62277]: DEBUG oslo_vmware.api [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Waiting for the task: (returnval){ [ 1479.823163] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]527bc789-872f-a8b8-2a5f-1198fe5700b9" [ 1479.823163] env[62277]: _type = "Task" [ 1479.823163] env[62277]: } to complete. 
{{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1479.834239] env[62277]: DEBUG oslo_vmware.api [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]527bc789-872f-a8b8-2a5f-1198fe5700b9, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1479.883838] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Unregistered the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1479.884329] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Deleting contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1479.884597] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Deleting the datastore file [datastore2] 350e2302-66b9-4dd6-b0f4-77000992408b {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1479.888406] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-581ff62f-a5e3-4db6-ba47-83b388054f0d {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1479.896443] env[62277]: DEBUG oslo_vmware.api [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Waiting for the task: (returnval){ [ 1479.896443] env[62277]: value = "task-1405412" [ 1479.896443] env[62277]: _type = "Task" [ 1479.896443] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1479.904845] env[62277]: DEBUG oslo_vmware.api [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Task: {'id': task-1405412, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1480.337146] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Preparing fetch location {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1480.337459] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Creating directory with path [datastore2] vmware_temp/9fe2aab3-b335-4b7a-8f9c-199fc9846516/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1480.337648] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-18659677-a969-4f0c-a34f-f18435a0360e {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1480.349962] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Created directory with path [datastore2] vmware_temp/9fe2aab3-b335-4b7a-8f9c-199fc9846516/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1480.350207] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Fetch image to [datastore2] vmware_temp/9fe2aab3-b335-4b7a-8f9c-199fc9846516/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1480.350378] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to [datastore2] vmware_temp/9fe2aab3-b335-4b7a-8f9c-199fc9846516/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1480.351151] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a28e7fc-7f2f-4edc-8178-1f669ab01d87 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1480.361323] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f9260d91-10c8-4206-b1dd-c68e12d531b2 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1480.370774] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5fb10512-d2ca-4902-b69f-16a2e406d611 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1480.408707] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-6568c936-607a-40ff-b903-1459102f54a2 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1480.418917] env[62277]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-7fe8b054-eeb4-4200-8fca-573e4c7ad0d7 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1480.420973] env[62277]: DEBUG oslo_vmware.api [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Task: {'id': task-1405412, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.067535} completed successfully. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1480.421491] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Deleted the datastore file {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1480.421698] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Deleted contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1480.421883] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1480.422146] env[62277]: INFO nova.compute.manager [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Took 0.63 seconds to destroy the instance on the hypervisor. 
[ 1480.424576] env[62277]: DEBUG nova.compute.claims [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Aborting claim: {{(pid=62277) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1480.424800] env[62277]: DEBUG oslo_concurrency.lockutils [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1480.425093] env[62277]: DEBUG oslo_concurrency.lockutils [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1480.445888] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1480.515567] env[62277]: DEBUG oslo_vmware.rw_handles [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/9fe2aab3-b335-4b7a-8f9c-199fc9846516/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62277) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1480.585097] env[62277]: DEBUG oslo_vmware.rw_handles [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Completed reading data from the image iterator. {{(pid=62277) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1480.585097] env[62277]: DEBUG oslo_vmware.rw_handles [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/9fe2aab3-b335-4b7a-8f9c-199fc9846516/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=62277) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1480.963999] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a7c8e66-0891-4de0-a751-6cd173bd7079 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1480.980016] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f17df2fd-87c9-4825-a89d-05b1f0b17f14 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1481.009368] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f7a2b5a2-ac32-41fd-851c-68398553b787 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1481.017971] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-54215709-69c9-450e-aeb5-4262ff348799 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1481.032166] env[62277]: DEBUG nova.compute.provider_tree [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1481.048542] env[62277]: DEBUG nova.scheduler.client.report [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1481.071805] env[62277]: DEBUG oslo_concurrency.lockutils [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.644s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1481.071805] env[62277]: ERROR nova.compute.manager [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1481.071805] env[62277]: Faults: ['InvalidArgument'] [ 1481.071805] env[62277]: ERROR nova.compute.manager [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Traceback (most recent call last): [ 1481.071805] env[62277]: ERROR nova.compute.manager [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] File "/opt/stack/nova/nova/compute/manager.py", line 
2615, in _build_and_run_instance [ 1481.071805] env[62277]: ERROR nova.compute.manager [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] self.driver.spawn(context, instance, image_meta, [ 1481.071805] env[62277]: ERROR nova.compute.manager [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1481.071805] env[62277]: ERROR nova.compute.manager [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1481.071805] env[62277]: ERROR nova.compute.manager [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1481.071805] env[62277]: ERROR nova.compute.manager [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] self._fetch_image_if_missing(context, vi) [ 1481.071805] env[62277]: ERROR nova.compute.manager [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1481.071805] env[62277]: ERROR nova.compute.manager [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] image_cache(vi, tmp_image_ds_loc) [ 1481.071805] env[62277]: ERROR nova.compute.manager [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1481.071805] env[62277]: ERROR nova.compute.manager [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] vm_util.copy_virtual_disk( [ 1481.071805] env[62277]: ERROR nova.compute.manager [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1481.071805] env[62277]: ERROR nova.compute.manager [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] session._wait_for_task(vmdk_copy_task) [ 1481.071805] env[62277]: ERROR nova.compute.manager [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1481.071805] env[62277]: ERROR nova.compute.manager [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] return self.wait_for_task(task_ref) [ 1481.071805] env[62277]: ERROR nova.compute.manager [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1481.071805] env[62277]: ERROR nova.compute.manager [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] return evt.wait() [ 1481.071805] env[62277]: ERROR nova.compute.manager [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1481.071805] env[62277]: ERROR nova.compute.manager [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] result = hub.switch() [ 1481.071805] env[62277]: ERROR nova.compute.manager [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1481.071805] env[62277]: ERROR nova.compute.manager [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] return self.greenlet.switch() [ 1481.071805] env[62277]: ERROR nova.compute.manager [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1481.071805] env[62277]: ERROR nova.compute.manager [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] self.f(*self.args, **self.kw) [ 1481.071805] env[62277]: ERROR nova.compute.manager [instance: 
350e2302-66b9-4dd6-b0f4-77000992408b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1481.071805] env[62277]: ERROR nova.compute.manager [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] raise exceptions.translate_fault(task_info.error) [ 1481.071805] env[62277]: ERROR nova.compute.manager [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1481.071805] env[62277]: ERROR nova.compute.manager [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Faults: ['InvalidArgument'] [ 1481.071805] env[62277]: ERROR nova.compute.manager [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] [ 1481.072756] env[62277]: DEBUG nova.compute.utils [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] VimFaultException {{(pid=62277) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1481.072756] env[62277]: DEBUG nova.compute.manager [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Build of instance 350e2302-66b9-4dd6-b0f4-77000992408b was re-scheduled: A specified parameter was not correct: fileType [ 1481.072756] env[62277]: Faults: ['InvalidArgument'] {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1481.072844] env[62277]: DEBUG nova.compute.manager [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Unplugging VIFs for instance {{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1481.073415] env[62277]: DEBUG nova.compute.manager [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1481.073415] env[62277]: DEBUG nova.compute.manager [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1481.073415] env[62277]: DEBUG nova.network.neutron [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1481.486514] env[62277]: DEBUG nova.network.neutron [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1481.507153] env[62277]: INFO nova.compute.manager [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Took 0.43 seconds to deallocate network for instance. [ 1481.619687] env[62277]: INFO nova.scheduler.client.report [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Deleted allocations for instance 350e2302-66b9-4dd6-b0f4-77000992408b [ 1481.644950] env[62277]: DEBUG oslo_concurrency.lockutils [None req-90eb08c2-c082-4b24-a6a5-349f3b6a49b5 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Lock "350e2302-66b9-4dd6-b0f4-77000992408b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 469.662s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1481.646283] env[62277]: DEBUG oslo_concurrency.lockutils [None req-0871020e-a31b-4cab-abc4-f4a95f73bc14 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Lock "350e2302-66b9-4dd6-b0f4-77000992408b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 272.362s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1481.647083] env[62277]: DEBUG oslo_concurrency.lockutils [None req-0871020e-a31b-4cab-abc4-f4a95f73bc14 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Acquiring lock "350e2302-66b9-4dd6-b0f4-77000992408b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1481.647332] env[62277]: DEBUG oslo_concurrency.lockutils [None req-0871020e-a31b-4cab-abc4-f4a95f73bc14 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] 
Lock "350e2302-66b9-4dd6-b0f4-77000992408b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.001s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1481.647567] env[62277]: DEBUG oslo_concurrency.lockutils [None req-0871020e-a31b-4cab-abc4-f4a95f73bc14 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Lock "350e2302-66b9-4dd6-b0f4-77000992408b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1481.649675] env[62277]: INFO nova.compute.manager [None req-0871020e-a31b-4cab-abc4-f4a95f73bc14 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Terminating instance [ 1481.652348] env[62277]: DEBUG nova.compute.manager [None req-0871020e-a31b-4cab-abc4-f4a95f73bc14 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Start destroying the instance on the hypervisor. {{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1481.652348] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-0871020e-a31b-4cab-abc4-f4a95f73bc14 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1481.652649] env[62277]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-0080444a-b70e-4bd0-9e6a-290023ce10b1 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1481.664538] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4e05fb09-e748-44ec-bfeb-271910df5c27 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1481.676656] env[62277]: DEBUG nova.compute.manager [None req-e92c5a32-c7c3-4b88-b3b2-36480963b926 tempest-ServerRescueTestJSON-2122562855 tempest-ServerRescueTestJSON-2122562855-project-member] [instance: c8d02374-bed2-4b4a-9bab-3a3dec87ad3e] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1481.704334] env[62277]: WARNING nova.virt.vmwareapi.vmops [None req-0871020e-a31b-4cab-abc4-f4a95f73bc14 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 350e2302-66b9-4dd6-b0f4-77000992408b could not be found. 
[ 1481.704735] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-0871020e-a31b-4cab-abc4-f4a95f73bc14 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1481.707028] env[62277]: INFO nova.compute.manager [None req-0871020e-a31b-4cab-abc4-f4a95f73bc14 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1481.707028] env[62277]: DEBUG oslo.service.loopingcall [None req-0871020e-a31b-4cab-abc4-f4a95f73bc14 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1481.707028] env[62277]: DEBUG nova.compute.manager [-] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1481.707028] env[62277]: DEBUG nova.network.neutron [-] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1481.710225] env[62277]: DEBUG nova.compute.manager [None req-e92c5a32-c7c3-4b88-b3b2-36480963b926 tempest-ServerRescueTestJSON-2122562855 tempest-ServerRescueTestJSON-2122562855-project-member] [instance: c8d02374-bed2-4b4a-9bab-3a3dec87ad3e] Instance disappeared before build. {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1481.737260] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e92c5a32-c7c3-4b88-b3b2-36480963b926 tempest-ServerRescueTestJSON-2122562855 tempest-ServerRescueTestJSON-2122562855-project-member] Lock "c8d02374-bed2-4b4a-9bab-3a3dec87ad3e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 237.484s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1481.753614] env[62277]: DEBUG nova.compute.manager [None req-1548f96a-4393-43bc-95ad-0f75a08c9eaa tempest-ServersV294TestFqdnHostnames-1549436271 tempest-ServersV294TestFqdnHostnames-1549436271-project-member] [instance: df611bf9-45db-4940-a59e-fccc7d96b935] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1481.756272] env[62277]: DEBUG nova.network.neutron [-] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1481.768635] env[62277]: INFO nova.compute.manager [-] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] Took 0.06 seconds to deallocate network for instance. [ 1481.791497] env[62277]: DEBUG nova.compute.manager [None req-1548f96a-4393-43bc-95ad-0f75a08c9eaa tempest-ServersV294TestFqdnHostnames-1549436271 tempest-ServersV294TestFqdnHostnames-1549436271-project-member] [instance: df611bf9-45db-4940-a59e-fccc7d96b935] Instance disappeared before build. 
{{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1481.827524] env[62277]: DEBUG oslo_concurrency.lockutils [None req-1548f96a-4393-43bc-95ad-0f75a08c9eaa tempest-ServersV294TestFqdnHostnames-1549436271 tempest-ServersV294TestFqdnHostnames-1549436271-project-member] Lock "df611bf9-45db-4940-a59e-fccc7d96b935" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 233.418s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1481.847268] env[62277]: DEBUG nova.compute.manager [None req-eaa6dcfc-5df7-4bce-bf02-13d827a6d37e tempest-SecurityGroupsTestJSON-463872970 tempest-SecurityGroupsTestJSON-463872970-project-member] [instance: 5d595f5e-6d35-4c89-a4e2-a3639c6145c8] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1481.880547] env[62277]: DEBUG nova.compute.manager [None req-eaa6dcfc-5df7-4bce-bf02-13d827a6d37e tempest-SecurityGroupsTestJSON-463872970 tempest-SecurityGroupsTestJSON-463872970-project-member] [instance: 5d595f5e-6d35-4c89-a4e2-a3639c6145c8] Instance disappeared before build. {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1481.901026] env[62277]: DEBUG oslo_concurrency.lockutils [None req-0871020e-a31b-4cab-abc4-f4a95f73bc14 tempest-ImagesOneServerNegativeTestJSON-229607395 tempest-ImagesOneServerNegativeTestJSON-229607395-project-member] Lock "350e2302-66b9-4dd6-b0f4-77000992408b" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.252s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1481.901026] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "350e2302-66b9-4dd6-b0f4-77000992408b" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 233.762s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1481.901026] env[62277]: INFO nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 350e2302-66b9-4dd6-b0f4-77000992408b] During sync_power_state the instance has a pending task (deleting). Skip. 
[ 1481.901026] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "350e2302-66b9-4dd6-b0f4-77000992408b" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1481.909739] env[62277]: DEBUG oslo_concurrency.lockutils [None req-eaa6dcfc-5df7-4bce-bf02-13d827a6d37e tempest-SecurityGroupsTestJSON-463872970 tempest-SecurityGroupsTestJSON-463872970-project-member] Lock "5d595f5e-6d35-4c89-a4e2-a3639c6145c8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 230.815s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1481.922628] env[62277]: DEBUG nova.compute.manager [None req-02618745-f64d-4b68-821f-0b1b3757a349 tempest-ServersTestManualDisk-209415039 tempest-ServersTestManualDisk-209415039-project-member] [instance: 4fd54f91-dedd-4ce2-8acf-8a2123be73b8] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1481.950519] env[62277]: DEBUG nova.compute.manager [None req-02618745-f64d-4b68-821f-0b1b3757a349 tempest-ServersTestManualDisk-209415039 tempest-ServersTestManualDisk-209415039-project-member] [instance: 4fd54f91-dedd-4ce2-8acf-8a2123be73b8] Instance disappeared before build. {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1481.974975] env[62277]: DEBUG oslo_concurrency.lockutils [None req-02618745-f64d-4b68-821f-0b1b3757a349 tempest-ServersTestManualDisk-209415039 tempest-ServersTestManualDisk-209415039-project-member] Lock "4fd54f91-dedd-4ce2-8acf-8a2123be73b8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 229.082s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1481.985732] env[62277]: DEBUG nova.compute.manager [None req-2ac43584-bf89-4422-91e8-9ba1f88bac13 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: baabe4ee-b366-45a8-bf06-cd63f697e7dc] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1482.011245] env[62277]: DEBUG nova.compute.manager [None req-2ac43584-bf89-4422-91e8-9ba1f88bac13 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: baabe4ee-b366-45a8-bf06-cd63f697e7dc] Instance disappeared before build. {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1482.039966] env[62277]: DEBUG oslo_concurrency.lockutils [None req-2ac43584-bf89-4422-91e8-9ba1f88bac13 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Lock "baabe4ee-b366-45a8-bf06-cd63f697e7dc" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 226.550s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1482.050520] env[62277]: DEBUG nova.compute.manager [None req-3fee9975-0813-4582-a552-f679545608ec tempest-ServerShowV257Test-724719980 tempest-ServerShowV257Test-724719980-project-member] [instance: 5cf06245-3fa1-4596-8260-7a82bc4a1193] Starting instance... 
{{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1482.078202] env[62277]: DEBUG nova.compute.manager [None req-3fee9975-0813-4582-a552-f679545608ec tempest-ServerShowV257Test-724719980 tempest-ServerShowV257Test-724719980-project-member] [instance: 5cf06245-3fa1-4596-8260-7a82bc4a1193] Instance disappeared before build. {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1482.107459] env[62277]: DEBUG oslo_concurrency.lockutils [None req-3fee9975-0813-4582-a552-f679545608ec tempest-ServerShowV257Test-724719980 tempest-ServerShowV257Test-724719980-project-member] Lock "5cf06245-3fa1-4596-8260-7a82bc4a1193" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 221.453s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1482.118117] env[62277]: DEBUG nova.compute.manager [None req-6f620688-b0e0-4890-aa07-12e4e3b48735 tempest-ListServerFiltersTestJSON-1537632818 tempest-ListServerFiltersTestJSON-1537632818-project-member] [instance: 308e5c48-c452-4dbf-94b0-1eb12951e620] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1482.142335] env[62277]: DEBUG nova.compute.manager [None req-6f620688-b0e0-4890-aa07-12e4e3b48735 tempest-ListServerFiltersTestJSON-1537632818 tempest-ListServerFiltersTestJSON-1537632818-project-member] [instance: 308e5c48-c452-4dbf-94b0-1eb12951e620] Instance disappeared before build. {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1482.169748] env[62277]: DEBUG oslo_concurrency.lockutils [None req-6f620688-b0e0-4890-aa07-12e4e3b48735 tempest-ListServerFiltersTestJSON-1537632818 tempest-ListServerFiltersTestJSON-1537632818-project-member] Lock "308e5c48-c452-4dbf-94b0-1eb12951e620" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 213.400s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1482.182925] env[62277]: DEBUG nova.compute.manager [None req-cfaa7f1e-b1cf-452b-8390-d89b308ccd82 tempest-ListServerFiltersTestJSON-1537632818 tempest-ListServerFiltersTestJSON-1537632818-project-member] [instance: 39172747-1245-473d-9f18-87bae208b5b1] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1482.207931] env[62277]: DEBUG nova.compute.manager [None req-cfaa7f1e-b1cf-452b-8390-d89b308ccd82 tempest-ListServerFiltersTestJSON-1537632818 tempest-ListServerFiltersTestJSON-1537632818-project-member] [instance: 39172747-1245-473d-9f18-87bae208b5b1] Instance disappeared before build. 
{{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1482.232302] env[62277]: DEBUG oslo_concurrency.lockutils [None req-cfaa7f1e-b1cf-452b-8390-d89b308ccd82 tempest-ListServerFiltersTestJSON-1537632818 tempest-ListServerFiltersTestJSON-1537632818-project-member] Lock "39172747-1245-473d-9f18-87bae208b5b1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 213.046s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1482.247954] env[62277]: DEBUG nova.compute.manager [None req-d087fe8a-ba45-4e7c-8fe4-fdab8a5b8226 tempest-ListServerFiltersTestJSON-1537632818 tempest-ListServerFiltersTestJSON-1537632818-project-member] [instance: 7ecfbbee-4955-4704-af62-ce8f5470cfbe] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1482.279909] env[62277]: DEBUG nova.compute.manager [None req-d087fe8a-ba45-4e7c-8fe4-fdab8a5b8226 tempest-ListServerFiltersTestJSON-1537632818 tempest-ListServerFiltersTestJSON-1537632818-project-member] [instance: 7ecfbbee-4955-4704-af62-ce8f5470cfbe] Instance disappeared before build. {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1482.303984] env[62277]: DEBUG oslo_concurrency.lockutils [None req-d087fe8a-ba45-4e7c-8fe4-fdab8a5b8226 tempest-ListServerFiltersTestJSON-1537632818 tempest-ListServerFiltersTestJSON-1537632818-project-member] Lock "7ecfbbee-4955-4704-af62-ce8f5470cfbe" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 212.684s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1482.315821] env[62277]: DEBUG nova.compute.manager [None req-c3e8c766-784b-47ca-8361-aea05dd9ff21 tempest-ServerExternalEventsTest-112581389 tempest-ServerExternalEventsTest-112581389-project-member] [instance: 4ca037fc-9a4e-413b-9b4e-2122f5a4fe18] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1482.346315] env[62277]: DEBUG nova.compute.manager [None req-c3e8c766-784b-47ca-8361-aea05dd9ff21 tempest-ServerExternalEventsTest-112581389 tempest-ServerExternalEventsTest-112581389-project-member] [instance: 4ca037fc-9a4e-413b-9b4e-2122f5a4fe18] Instance disappeared before build. {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1482.378607] env[62277]: DEBUG oslo_concurrency.lockutils [None req-c3e8c766-784b-47ca-8361-aea05dd9ff21 tempest-ServerExternalEventsTest-112581389 tempest-ServerExternalEventsTest-112581389-project-member] Lock "4ca037fc-9a4e-413b-9b4e-2122f5a4fe18" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 200.815s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1482.390752] env[62277]: DEBUG nova.compute.manager [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Starting instance... 
{{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1482.458686] env[62277]: DEBUG oslo_concurrency.lockutils [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1482.458940] env[62277]: DEBUG oslo_concurrency.lockutils [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1482.460423] env[62277]: INFO nova.compute.claims [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1482.921773] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-26658e7d-337e-4352-a66a-fc21e54a8067 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1482.930114] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4b0afd83-2ffe-4867-84b3-f23e1a7a1e8f {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1482.961193] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f1fbbfe2-1bef-4e38-a303-f1cd443accb0 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1482.969608] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce16e27d-b591-4dde-8dab-d86d29ba6f89 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1482.985053] env[62277]: DEBUG nova.compute.provider_tree [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1482.995167] env[62277]: DEBUG nova.scheduler.client.report [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1483.010064] env[62277]: DEBUG oslo_concurrency.lockutils [None 
req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.551s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1483.010668] env[62277]: DEBUG nova.compute.manager [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Start building networks asynchronously for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1483.048617] env[62277]: DEBUG nova.compute.utils [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Using /dev/sd instead of None {{(pid=62277) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1483.049905] env[62277]: DEBUG nova.compute.manager [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Allocating IP information in the background. {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1483.050027] env[62277]: DEBUG nova.network.neutron [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] allocate_for_instance() {{(pid=62277) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1483.061446] env[62277]: DEBUG nova.compute.manager [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Start building block device mappings for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1483.129724] env[62277]: DEBUG nova.compute.manager [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Start spawning the instance on the hypervisor. 
{{(pid=62277) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1483.137161] env[62277]: DEBUG nova.policy [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b1df8436897c4b0fa01107670094fd01', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1d3f45b91a0c4d4489f760e938ff9900', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62277) authorize /opt/stack/nova/nova/policy.py:203}} [ 1483.157719] env[62277]: DEBUG nova.virt.hardware [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-11-06T22:11:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-11-06T22:11:15Z,direct_url=,disk_format='vmdk',id=6f125163-af69-40e9-92ae-3b8a01d74b60,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='63ed6630e9c140baa826f53d7a0564d1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-11-06T22:11:16Z,virtual_size=,visibility=), allow threads: False {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1483.157952] env[62277]: DEBUG nova.virt.hardware [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Flavor limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1483.158117] env[62277]: DEBUG nova.virt.hardware [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Image limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1483.158295] env[62277]: DEBUG nova.virt.hardware [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Flavor pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1483.158435] env[62277]: DEBUG nova.virt.hardware [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Image pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1483.158581] env[62277]: DEBUG nova.virt.hardware [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1483.158788] env[62277]: DEBUG 
nova.virt.hardware [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1483.158943] env[62277]: DEBUG nova.virt.hardware [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1483.159242] env[62277]: DEBUG nova.virt.hardware [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Got 1 possible topologies {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1483.159441] env[62277]: DEBUG nova.virt.hardware [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1483.159616] env[62277]: DEBUG nova.virt.hardware [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1483.160696] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6489395a-b732-4f5e-ab7a-cbf5e7b66590 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1483.170968] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c966945f-ba4a-4594-9592-06947d1a9caa {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1483.559084] env[62277]: DEBUG nova.network.neutron [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Successfully created port: 2c3854b5-396e-4aa4-a40a-75247a338dfc {{(pid=62277) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1484.303861] env[62277]: DEBUG nova.compute.manager [req-eb66ed4b-4341-4204-860b-edd6f91d20cd req-c8d56296-fc62-4498-b836-5f6ef5b91ac1 service nova] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Received event network-vif-plugged-2c3854b5-396e-4aa4-a40a-75247a338dfc {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 1484.304106] env[62277]: DEBUG oslo_concurrency.lockutils [req-eb66ed4b-4341-4204-860b-edd6f91d20cd req-c8d56296-fc62-4498-b836-5f6ef5b91ac1 service nova] Acquiring lock "900160c8-a715-45a4-8709-b314fc3216d5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1484.304319] env[62277]: DEBUG oslo_concurrency.lockutils [req-eb66ed4b-4341-4204-860b-edd6f91d20cd 
req-c8d56296-fc62-4498-b836-5f6ef5b91ac1 service nova] Lock "900160c8-a715-45a4-8709-b314fc3216d5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1484.304482] env[62277]: DEBUG oslo_concurrency.lockutils [req-eb66ed4b-4341-4204-860b-edd6f91d20cd req-c8d56296-fc62-4498-b836-5f6ef5b91ac1 service nova] Lock "900160c8-a715-45a4-8709-b314fc3216d5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1484.304643] env[62277]: DEBUG nova.compute.manager [req-eb66ed4b-4341-4204-860b-edd6f91d20cd req-c8d56296-fc62-4498-b836-5f6ef5b91ac1 service nova] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] No waiting events found dispatching network-vif-plugged-2c3854b5-396e-4aa4-a40a-75247a338dfc {{(pid=62277) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1484.304802] env[62277]: WARNING nova.compute.manager [req-eb66ed4b-4341-4204-860b-edd6f91d20cd req-c8d56296-fc62-4498-b836-5f6ef5b91ac1 service nova] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Received unexpected event network-vif-plugged-2c3854b5-396e-4aa4-a40a-75247a338dfc for instance with vm_state building and task_state spawning. [ 1484.364725] env[62277]: DEBUG nova.network.neutron [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Successfully updated port: 2c3854b5-396e-4aa4-a40a-75247a338dfc {{(pid=62277) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1484.375939] env[62277]: DEBUG oslo_concurrency.lockutils [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Acquiring lock "refresh_cache-900160c8-a715-45a4-8709-b314fc3216d5" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1484.376103] env[62277]: DEBUG oslo_concurrency.lockutils [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Acquired lock "refresh_cache-900160c8-a715-45a4-8709-b314fc3216d5" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1484.376252] env[62277]: DEBUG nova.network.neutron [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Building network info cache for instance {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1484.442392] env[62277]: DEBUG nova.network.neutron [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Instance cache missing network info. 
{{(pid=62277) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1484.681526] env[62277]: DEBUG nova.network.neutron [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Updating instance_info_cache with network_info: [{"id": "2c3854b5-396e-4aa4-a40a-75247a338dfc", "address": "fa:16:3e:e1:a7:72", "network": {"id": "12809283-c661-4cd1-9296-8f9f453d0ec1", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1452491577-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1d3f45b91a0c4d4489f760e938ff9900", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "db00ec2e-3155-46b6-8170-082f7d86dbe7", "external-id": "nsx-vlan-transportzone-332", "segmentation_id": 332, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2c3854b5-39", "ovs_interfaceid": "2c3854b5-396e-4aa4-a40a-75247a338dfc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1484.693130] env[62277]: DEBUG oslo_concurrency.lockutils [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Releasing lock "refresh_cache-900160c8-a715-45a4-8709-b314fc3216d5" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1484.693484] env[62277]: DEBUG nova.compute.manager [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Instance network_info: |[{"id": "2c3854b5-396e-4aa4-a40a-75247a338dfc", "address": "fa:16:3e:e1:a7:72", "network": {"id": "12809283-c661-4cd1-9296-8f9f453d0ec1", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1452491577-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1d3f45b91a0c4d4489f760e938ff9900", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "db00ec2e-3155-46b6-8170-082f7d86dbe7", "external-id": "nsx-vlan-transportzone-332", "segmentation_id": 332, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2c3854b5-39", "ovs_interfaceid": "2c3854b5-396e-4aa4-a40a-75247a338dfc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62277) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 1484.694389] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:e1:a7:72', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'db00ec2e-3155-46b6-8170-082f7d86dbe7', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '2c3854b5-396e-4aa4-a40a-75247a338dfc', 'vif_model': 'vmxnet3'}] {{(pid=62277) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1484.703822] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Creating folder: Project (1d3f45b91a0c4d4489f760e938ff9900). Parent ref: group-v297781. {{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1484.703822] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1f12e009-4f29-4a3a-9f84-84c0bcbbe186 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1484.716911] env[62277]: INFO nova.virt.vmwareapi.vm_util [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Created folder: Project (1d3f45b91a0c4d4489f760e938ff9900) in parent group-v297781. [ 1484.717197] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Creating folder: Instances. Parent ref: group-v297851. {{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1484.717454] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-5707372b-2489-4aca-a730-5011ed132ce9 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1484.727114] env[62277]: INFO nova.virt.vmwareapi.vm_util [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Created folder: Instances in parent group-v297851. [ 1484.727355] env[62277]: DEBUG oslo.service.loopingcall [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1484.727540] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Creating VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1484.727750] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-7d86569c-cc1e-45a3-b75d-602443a95204 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1484.749200] env[62277]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1484.749200] env[62277]: value = "task-1405415" [ 1484.749200] env[62277]: _type = "Task" [ 1484.749200] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1484.757179] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405415, 'name': CreateVM_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1485.260777] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405415, 'name': CreateVM_Task, 'duration_secs': 0.32453} completed successfully. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1485.260957] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Created VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1485.261681] env[62277]: DEBUG oslo_concurrency.lockutils [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1485.261846] env[62277]: DEBUG oslo_concurrency.lockutils [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1485.262187] env[62277]: DEBUG oslo_concurrency.lockutils [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1485.262445] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d531822d-922e-44d9-962c-6c4a88417833 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1485.267789] env[62277]: DEBUG oslo_vmware.api [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Waiting for the task: (returnval){ [ 1485.267789] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]524ee041-363e-7142-3773-0ef66e723748" [ 1485.267789] env[62277]: _type = "Task" [ 1485.267789] env[62277]: } to complete. 
{{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1485.277624] env[62277]: DEBUG oslo_vmware.api [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]524ee041-363e-7142-3773-0ef66e723748, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1485.662459] env[62277]: DEBUG oslo_concurrency.lockutils [None req-d6bf19c4-020a-4aca-8044-ada8667f45e5 tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Acquiring lock "900160c8-a715-45a4-8709-b314fc3216d5" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1485.779558] env[62277]: DEBUG oslo_concurrency.lockutils [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1485.779808] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Processing image 6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1485.780032] env[62277]: DEBUG oslo_concurrency.lockutils [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1486.361501] env[62277]: DEBUG nova.compute.manager [req-d9dfc3a0-4d7f-494a-a113-4af26ad4eda5 req-1b63b3a5-9748-417c-88b2-830a76d62b6a service nova] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Received event network-changed-2c3854b5-396e-4aa4-a40a-75247a338dfc {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 1486.361686] env[62277]: DEBUG nova.compute.manager [req-d9dfc3a0-4d7f-494a-a113-4af26ad4eda5 req-1b63b3a5-9748-417c-88b2-830a76d62b6a service nova] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Refreshing instance network info cache due to event network-changed-2c3854b5-396e-4aa4-a40a-75247a338dfc. 
{{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11104}} [ 1486.361885] env[62277]: DEBUG oslo_concurrency.lockutils [req-d9dfc3a0-4d7f-494a-a113-4af26ad4eda5 req-1b63b3a5-9748-417c-88b2-830a76d62b6a service nova] Acquiring lock "refresh_cache-900160c8-a715-45a4-8709-b314fc3216d5" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1486.366068] env[62277]: DEBUG oslo_concurrency.lockutils [req-d9dfc3a0-4d7f-494a-a113-4af26ad4eda5 req-1b63b3a5-9748-417c-88b2-830a76d62b6a service nova] Acquired lock "refresh_cache-900160c8-a715-45a4-8709-b314fc3216d5" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1486.366343] env[62277]: DEBUG nova.network.neutron [req-d9dfc3a0-4d7f-494a-a113-4af26ad4eda5 req-1b63b3a5-9748-417c-88b2-830a76d62b6a service nova] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Refreshing network info cache for port 2c3854b5-396e-4aa4-a40a-75247a338dfc {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1487.098123] env[62277]: DEBUG nova.network.neutron [req-d9dfc3a0-4d7f-494a-a113-4af26ad4eda5 req-1b63b3a5-9748-417c-88b2-830a76d62b6a service nova] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Updated VIF entry in instance network info cache for port 2c3854b5-396e-4aa4-a40a-75247a338dfc. {{(pid=62277) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1487.098496] env[62277]: DEBUG nova.network.neutron [req-d9dfc3a0-4d7f-494a-a113-4af26ad4eda5 req-1b63b3a5-9748-417c-88b2-830a76d62b6a service nova] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Updating instance_info_cache with network_info: [{"id": "2c3854b5-396e-4aa4-a40a-75247a338dfc", "address": "fa:16:3e:e1:a7:72", "network": {"id": "12809283-c661-4cd1-9296-8f9f453d0ec1", "bridge": "br-int", "label": "tempest-InstanceActionsTestJSON-1452491577-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1d3f45b91a0c4d4489f760e938ff9900", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "db00ec2e-3155-46b6-8170-082f7d86dbe7", "external-id": "nsx-vlan-transportzone-332", "segmentation_id": 332, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2c3854b5-39", "ovs_interfaceid": "2c3854b5-396e-4aa4-a40a-75247a338dfc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1487.107954] env[62277]: DEBUG oslo_concurrency.lockutils [req-d9dfc3a0-4d7f-494a-a113-4af26ad4eda5 req-1b63b3a5-9748-417c-88b2-830a76d62b6a service nova] Releasing lock "refresh_cache-900160c8-a715-45a4-8709-b314fc3216d5" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1499.158640] env[62277]: DEBUG oslo_concurrency.lockutils [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] 
Acquiring lock "42005809-1926-44b2-8ef6-3b6cb28a4020" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1499.158970] env[62277]: DEBUG oslo_concurrency.lockutils [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Lock "42005809-1926-44b2-8ef6-3b6cb28a4020" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1514.094101] env[62277]: DEBUG oslo_concurrency.lockutils [None req-d7e34775-9706-428a-8656-4894ed1026cf tempest-ServerAddressesTestJSON-868450768 tempest-ServerAddressesTestJSON-868450768-project-member] Acquiring lock "1742e8d0-3cf2-4a78-99e4-652f9664df96" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1514.094446] env[62277]: DEBUG oslo_concurrency.lockutils [None req-d7e34775-9706-428a-8656-4894ed1026cf tempest-ServerAddressesTestJSON-868450768 tempest-ServerAddressesTestJSON-868450768-project-member] Lock "1742e8d0-3cf2-4a78-99e4-652f9664df96" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1521.169478] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1521.169851] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Starting heal instance info cache {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9909}} [ 1521.169851] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Rebuilding the list of instances to heal {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9913}} [ 1521.192204] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1521.192375] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1521.192434] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Skipping network cache update for instance because it is Building. 
{{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1521.192562] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1521.192763] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1521.192905] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1521.193030] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1521.193149] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1521.193274] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 8d00162c-7379-48b6-841b-f802db2582db] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1521.193389] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1521.193504] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Didn't find any instances for network info cache update. 
{{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9995}} [ 1521.194019] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1522.168841] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1522.169114] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1522.169265] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Cleaning up deleted instances {{(pid=62277) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11196}} [ 1522.183747] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] There are 1 instances to clean {{(pid=62277) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11205}} [ 1522.184047] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 154fa64c-55d4-4b72-8af9-39e72fd5df5f] Instance has had 0 of 5 cleanup attempts {{(pid=62277) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11209}} [ 1522.220265] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1522.220416] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Cleaning up deleted instances with incomplete migration {{(pid=62277) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11234}} [ 1523.222605] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1523.223410] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1525.168698] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1525.168984] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1525.169223] env[62277]: DEBUG oslo_service.periodic_task [None 
req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager.update_available_resource {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1525.180590] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1525.180789] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1525.180950] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1525.181133] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62277) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1525.182532] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8eb2c57c-8f89-46ca-a394-14db799f4876 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1525.191437] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8152576f-e49a-4e5b-810b-d56fbb98d0c3 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1525.206798] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3d5eca69-4946-4c41-ad24-0f981dd13e92 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1525.212756] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48d86f68-7f9c-4222-9519-7e34a83035ff {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1525.243377] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181372MB free_disk=184GB free_vcpus=48 pci_devices=None {{(pid=62277) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1525.243540] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1525.243742] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by 
"nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1525.374307] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 23bc5a48-3e96-4897-bf28-ad14a0bdde62 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1525.374472] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 346748bd-b4e8-4e93-b71d-66c90a45e372 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1525.374601] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 930ff058-ab48-4c8a-8f5e-4820a3b12d50 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1525.374725] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1525.374857] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 1e8429b2-7149-4832-8590-e0ebd8501176 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1525.374976] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 21bd4623-2b46-43c4-859f-c4d3bf261e1f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1525.375105] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 6d759045-e1fc-43ea-a882-1ead769b6d29 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1525.375222] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 32bed248-06d5-47a1-b281-47921d99dbf6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1525.375340] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 8d00162c-7379-48b6-841b-f802db2582db actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1525.375453] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 900160c8-a715-45a4-8709-b314fc3216d5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1525.387322] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 63267d5c-d004-41c1-866a-75b9e37521b7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1525.397660] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 13959890-87a1-45ba-98de-621373e265e7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1525.407217] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance b2ad5654-28f5-43b2-acb8-a7eb01f70b55 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1525.416604] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance ad6f7885-4dbd-4306-baf2-b75cc43276d3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1525.426825] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 7a91e375-b4c6-4e05-a631-dae60926de1a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1525.438540] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 26dc82d6-bc3c-4b53-8fff-64578be0d404 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1525.449399] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 7e8d3614-e021-4509-87c7-1c4d68c3e570 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1525.461045] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance c3af2c56-c745-43e4-9f1b-77937a1d2559 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1525.470970] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance b8467237-7a59-4302-9dd3-0cbdfc813753 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1525.501155] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance a7cc7e45-8567-4699-af83-624b1c7c5c64 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1525.512808] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 24ee6c71-7267-4fe2-8ac4-84bf1d00c024 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1525.523317] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance f468d0c6-35ed-4f8d-a3dc-aea9462aa7bb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1525.533647] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 8167915d-ed3a-44b7-8eff-d585e7f6ffbf has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1525.543932] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 9fedbb74-ae57-4cb8-8496-2ff9c703b46e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1525.554572] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 42005809-1926-44b2-8ef6-3b6cb28a4020 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1525.566606] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 1742e8d0-3cf2-4a78-99e4-652f9664df96 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1525.566849] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1525.566995] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1525.583991] env[62277]: DEBUG nova.scheduler.client.report [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Refreshing inventories for resource provider 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 1525.598160] env[62277]: DEBUG nova.scheduler.client.report [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Updating ProviderTree inventory for provider 75e125ea-a599-4b65-b9cd-6ea881735292 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 1525.598354] env[62277]: DEBUG nova.compute.provider_tree [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Updating inventory in ProviderTree for provider 75e125ea-a599-4b65-b9cd-6ea881735292 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 1525.609260] env[62277]: DEBUG nova.scheduler.client.report [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Refreshing aggregate associations for resource provider 75e125ea-a599-4b65-b9cd-6ea881735292, aggregates: None {{(pid=62277) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 1525.627449] env[62277]: DEBUG nova.scheduler.client.report [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Refreshing trait associations for resource provider 75e125ea-a599-4b65-b9cd-6ea881735292, traits: COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_VMDK {{(pid=62277) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 1526.025801] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-839fc82a-0a6d-4812-ac3b-a80e568be068 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1526.033466] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-8f1fe6d1-ec4a-4384-b2fb-0ae1c886d0a7 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1526.063123] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b10f5aeb-4077-4032-b96e-91360a604610 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1526.071393] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d9737974-c734-4291-b659-e2146d8de40d {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1526.085590] env[62277]: DEBUG nova.compute.provider_tree [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1526.095393] env[62277]: DEBUG nova.scheduler.client.report [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1526.108955] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62277) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1526.109119] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.865s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1527.075366] env[62277]: WARNING oslo_vmware.rw_handles [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1527.075366] env[62277]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1527.075366] env[62277]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1527.075366] env[62277]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1527.075366] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1527.075366] env[62277]: ERROR oslo_vmware.rw_handles response.begin() [ 1527.075366] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1527.075366] env[62277]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1527.075366] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 
1527.075366] env[62277]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1527.075366] env[62277]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1527.075366] env[62277]: ERROR oslo_vmware.rw_handles [ 1527.076152] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Downloaded image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to vmware_temp/9fe2aab3-b335-4b7a-8f9c-199fc9846516/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1527.078353] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Caching image {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1527.078654] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Copying Virtual Disk [datastore2] vmware_temp/9fe2aab3-b335-4b7a-8f9c-199fc9846516/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk to [datastore2] vmware_temp/9fe2aab3-b335-4b7a-8f9c-199fc9846516/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk {{(pid=62277) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1527.078952] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-81cf0601-61c7-4d6d-9db9-0d909da27f80 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1527.087300] env[62277]: DEBUG oslo_vmware.api [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Waiting for the task: (returnval){ [ 1527.087300] env[62277]: value = "task-1405416" [ 1527.087300] env[62277]: _type = "Task" [ 1527.087300] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1527.095336] env[62277]: DEBUG oslo_vmware.api [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Task: {'id': task-1405416, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1527.597972] env[62277]: DEBUG oslo_vmware.exceptions [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Fault InvalidArgument not matched. 
{{(pid=62277) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1527.598275] env[62277]: DEBUG oslo_concurrency.lockutils [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1527.599093] env[62277]: ERROR nova.compute.manager [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1527.599093] env[62277]: Faults: ['InvalidArgument'] [ 1527.599093] env[62277]: ERROR nova.compute.manager [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Traceback (most recent call last): [ 1527.599093] env[62277]: ERROR nova.compute.manager [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1527.599093] env[62277]: ERROR nova.compute.manager [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] yield resources [ 1527.599093] env[62277]: ERROR nova.compute.manager [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1527.599093] env[62277]: ERROR nova.compute.manager [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] self.driver.spawn(context, instance, image_meta, [ 1527.599093] env[62277]: ERROR nova.compute.manager [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1527.599093] env[62277]: ERROR nova.compute.manager [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1527.599093] env[62277]: ERROR nova.compute.manager [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1527.599093] env[62277]: ERROR nova.compute.manager [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] self._fetch_image_if_missing(context, vi) [ 1527.599093] env[62277]: ERROR nova.compute.manager [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1527.599093] env[62277]: ERROR nova.compute.manager [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] image_cache(vi, tmp_image_ds_loc) [ 1527.599093] env[62277]: ERROR nova.compute.manager [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1527.599093] env[62277]: ERROR nova.compute.manager [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] vm_util.copy_virtual_disk( [ 1527.599093] env[62277]: ERROR nova.compute.manager [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1527.599093] env[62277]: ERROR nova.compute.manager [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] session._wait_for_task(vmdk_copy_task) [ 1527.599093] env[62277]: ERROR nova.compute.manager [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1527.599093] env[62277]: ERROR nova.compute.manager [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] return self.wait_for_task(task_ref) [ 1527.599093] env[62277]: ERROR nova.compute.manager [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1527.599093] env[62277]: ERROR nova.compute.manager [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] return evt.wait() [ 1527.599093] env[62277]: ERROR nova.compute.manager [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1527.599093] env[62277]: ERROR nova.compute.manager [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] result = hub.switch() [ 1527.599093] env[62277]: ERROR nova.compute.manager [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1527.599093] env[62277]: ERROR nova.compute.manager [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] return self.greenlet.switch() [ 1527.599093] env[62277]: ERROR nova.compute.manager [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1527.599093] env[62277]: ERROR nova.compute.manager [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] self.f(*self.args, **self.kw) [ 1527.599093] env[62277]: ERROR nova.compute.manager [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1527.599093] env[62277]: ERROR nova.compute.manager [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] raise exceptions.translate_fault(task_info.error) [ 1527.599093] env[62277]: ERROR nova.compute.manager [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1527.599093] env[62277]: ERROR nova.compute.manager [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Faults: ['InvalidArgument'] [ 1527.599093] env[62277]: ERROR nova.compute.manager [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] [ 1527.599093] env[62277]: INFO nova.compute.manager [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Terminating instance [ 1527.600650] env[62277]: DEBUG oslo_concurrency.lockutils [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1527.600843] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1527.601597] env[62277]: DEBUG nova.compute.manager [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 
tempest-VolumesAssistedSnapshotsTest-658300929-project-member] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Start destroying the instance on the hypervisor. {{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1527.601681] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1527.601885] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-dd46410e-fa8c-43c9-baf3-3c38fb389e2e {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1527.604464] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3deec40c-434f-4ca6-aa33-60a3f8166109 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1527.611336] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Unregistering the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1527.611560] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-595b979a-fd54-40f9-9dad-0742ca146324 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1527.613776] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1527.613946] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62277) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1527.614955] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-68b9a759-0732-440f-97f5-a11172e0368b {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1527.619920] env[62277]: DEBUG oslo_vmware.api [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Waiting for the task: (returnval){ [ 1527.619920] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52d8b0fb-ba69-114c-1019-bca10e506992" [ 1527.619920] env[62277]: _type = "Task" [ 1527.619920] env[62277]: } to complete. 
{{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1527.627399] env[62277]: DEBUG oslo_vmware.api [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52d8b0fb-ba69-114c-1019-bca10e506992, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1527.692169] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Unregistered the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1527.692399] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Deleting contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1527.692575] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Deleting the datastore file [datastore2] 23bc5a48-3e96-4897-bf28-ad14a0bdde62 {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1527.693823] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-f20219a9-8f18-4cba-899d-feab315d8655 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1527.700769] env[62277]: DEBUG oslo_vmware.api [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Waiting for the task: (returnval){ [ 1527.700769] env[62277]: value = "task-1405418" [ 1527.700769] env[62277]: _type = "Task" [ 1527.700769] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1527.707934] env[62277]: DEBUG oslo_vmware.api [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Task: {'id': task-1405418, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1528.129466] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Preparing fetch location {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1528.129737] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Creating directory with path [datastore2] vmware_temp/1544e1c7-d3ee-4635-9311-bb7a5cb52173/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1528.129944] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-076283a1-3078-4820-a741-2c1e2935c690 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1528.141294] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Created directory with path [datastore2] vmware_temp/1544e1c7-d3ee-4635-9311-bb7a5cb52173/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1528.141486] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Fetch image to [datastore2] vmware_temp/1544e1c7-d3ee-4635-9311-bb7a5cb52173/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1528.141650] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to [datastore2] vmware_temp/1544e1c7-d3ee-4635-9311-bb7a5cb52173/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1528.142499] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3600f1bd-c11c-4e67-b3c5-5615a7d95dc0 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1528.149470] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9eb518b9-b42a-4ef7-aedb-afa9a8f7ceb7 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1528.159671] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5aad1a7e-705e-4e1d-ac02-02a99c48166b {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1528.191752] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d8c09553-d6af-45a6-8baf-c1cd0a220b59 {{(pid=62277) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1528.197488] env[62277]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-aa6eccb6-0c7e-448c-a5ca-c3987a1f1e1b {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1528.207810] env[62277]: DEBUG oslo_vmware.api [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Task: {'id': task-1405418, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.063621} completed successfully. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1528.208052] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Deleted the datastore file {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1528.208277] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Deleted contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1528.208398] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1528.208596] env[62277]: INFO nova.compute.manager [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Took 0.61 seconds to destroy the instance on the hypervisor. 
[ 1528.210901] env[62277]: DEBUG nova.compute.claims [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Aborting claim: {{(pid=62277) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1528.211120] env[62277]: DEBUG oslo_concurrency.lockutils [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1528.212316] env[62277]: DEBUG oslo_concurrency.lockutils [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1528.218069] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1528.298064] env[62277]: DEBUG oslo_vmware.rw_handles [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1544e1c7-d3ee-4635-9311-bb7a5cb52173/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62277) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1528.363045] env[62277]: DEBUG oslo_vmware.rw_handles [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Completed reading data from the image iterator. {{(pid=62277) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1528.363319] env[62277]: DEBUG oslo_vmware.rw_handles [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1544e1c7-d3ee-4635-9311-bb7a5cb52173/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=62277) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1528.656260] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-902b75fb-1b0e-4b27-b3a3-62961595a995 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1528.665290] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f7fa49c8-dab3-4339-b1cc-7295c35962f0 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1528.701548] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c227ed4-b25a-4c49-bb2b-babbfae23042 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1528.709300] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bb53a570-677b-4ca8-a626-19f008e307ac {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1528.723784] env[62277]: DEBUG nova.compute.provider_tree [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1528.735976] env[62277]: DEBUG nova.scheduler.client.report [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1528.758162] env[62277]: DEBUG oslo_concurrency.lockutils [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.547s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1528.758704] env[62277]: ERROR nova.compute.manager [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1528.758704] env[62277]: Faults: ['InvalidArgument'] [ 1528.758704] env[62277]: ERROR nova.compute.manager [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Traceback (most recent call last): [ 1528.758704] env[62277]: ERROR nova.compute.manager [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in 
_build_and_run_instance [ 1528.758704] env[62277]: ERROR nova.compute.manager [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] self.driver.spawn(context, instance, image_meta, [ 1528.758704] env[62277]: ERROR nova.compute.manager [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1528.758704] env[62277]: ERROR nova.compute.manager [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1528.758704] env[62277]: ERROR nova.compute.manager [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1528.758704] env[62277]: ERROR nova.compute.manager [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] self._fetch_image_if_missing(context, vi) [ 1528.758704] env[62277]: ERROR nova.compute.manager [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1528.758704] env[62277]: ERROR nova.compute.manager [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] image_cache(vi, tmp_image_ds_loc) [ 1528.758704] env[62277]: ERROR nova.compute.manager [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1528.758704] env[62277]: ERROR nova.compute.manager [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] vm_util.copy_virtual_disk( [ 1528.758704] env[62277]: ERROR nova.compute.manager [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1528.758704] env[62277]: ERROR nova.compute.manager [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] session._wait_for_task(vmdk_copy_task) [ 1528.758704] env[62277]: ERROR nova.compute.manager [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1528.758704] env[62277]: ERROR nova.compute.manager [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] return self.wait_for_task(task_ref) [ 1528.758704] env[62277]: ERROR nova.compute.manager [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1528.758704] env[62277]: ERROR nova.compute.manager [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] return evt.wait() [ 1528.758704] env[62277]: ERROR nova.compute.manager [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1528.758704] env[62277]: ERROR nova.compute.manager [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] result = hub.switch() [ 1528.758704] env[62277]: ERROR nova.compute.manager [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1528.758704] env[62277]: ERROR nova.compute.manager [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] return self.greenlet.switch() [ 1528.758704] env[62277]: ERROR nova.compute.manager [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1528.758704] env[62277]: ERROR nova.compute.manager [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] self.f(*self.args, **self.kw) [ 1528.758704] env[62277]: ERROR nova.compute.manager [instance: 
23bc5a48-3e96-4897-bf28-ad14a0bdde62] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1528.758704] env[62277]: ERROR nova.compute.manager [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] raise exceptions.translate_fault(task_info.error) [ 1528.758704] env[62277]: ERROR nova.compute.manager [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1528.758704] env[62277]: ERROR nova.compute.manager [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Faults: ['InvalidArgument'] [ 1528.758704] env[62277]: ERROR nova.compute.manager [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] [ 1528.759731] env[62277]: DEBUG nova.compute.utils [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] VimFaultException {{(pid=62277) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1528.760914] env[62277]: DEBUG nova.compute.manager [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Build of instance 23bc5a48-3e96-4897-bf28-ad14a0bdde62 was re-scheduled: A specified parameter was not correct: fileType [ 1528.760914] env[62277]: Faults: ['InvalidArgument'] {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1528.761289] env[62277]: DEBUG nova.compute.manager [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Unplugging VIFs for instance {{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1528.761461] env[62277]: DEBUG nova.compute.manager [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1528.761629] env[62277]: DEBUG nova.compute.manager [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1528.761788] env[62277]: DEBUG nova.network.neutron [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1529.108565] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1529.108751] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=62277) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10528}} [ 1529.123722] env[62277]: DEBUG nova.network.neutron [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1529.143692] env[62277]: INFO nova.compute.manager [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Took 0.38 seconds to deallocate network for instance. 
[ 1529.257103] env[62277]: INFO nova.scheduler.client.report [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Deleted allocations for instance 23bc5a48-3e96-4897-bf28-ad14a0bdde62 [ 1529.288070] env[62277]: DEBUG oslo_concurrency.lockutils [None req-fbe65225-1f8f-4f74-b42a-27fe48aad375 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Lock "23bc5a48-3e96-4897-bf28-ad14a0bdde62" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 515.282s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1529.289241] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b01d96ad-e621-46ce-b976-851d54e12df8 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Lock "23bc5a48-3e96-4897-bf28-ad14a0bdde62" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 316.824s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1529.289463] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b01d96ad-e621-46ce-b976-851d54e12df8 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Acquiring lock "23bc5a48-3e96-4897-bf28-ad14a0bdde62-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1529.289663] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b01d96ad-e621-46ce-b976-851d54e12df8 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Lock "23bc5a48-3e96-4897-bf28-ad14a0bdde62-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1529.289824] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b01d96ad-e621-46ce-b976-851d54e12df8 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Lock "23bc5a48-3e96-4897-bf28-ad14a0bdde62-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1529.292197] env[62277]: INFO nova.compute.manager [None req-b01d96ad-e621-46ce-b976-851d54e12df8 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Terminating instance [ 1529.294063] env[62277]: DEBUG nova.compute.manager [None req-b01d96ad-e621-46ce-b976-851d54e12df8 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Start destroying the instance on the hypervisor. 
{{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1529.294308] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-b01d96ad-e621-46ce-b976-851d54e12df8 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1529.295142] env[62277]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-525d9cdd-7833-4209-b1ac-4eeb3e8d92a3 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1529.305142] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5fc0bb5c-0746-48b4-9389-855c49887a41 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1529.319513] env[62277]: DEBUG nova.compute.manager [None req-2b0870bb-0544-48f2-bcd2-316ff7a5acd3 tempest-AttachInterfacesV270Test-44240375 tempest-AttachInterfacesV270Test-44240375-project-member] [instance: 0901d48b-bc88-461b-8503-eb6b51c39148] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1529.345121] env[62277]: WARNING nova.virt.vmwareapi.vmops [None req-b01d96ad-e621-46ce-b976-851d54e12df8 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 23bc5a48-3e96-4897-bf28-ad14a0bdde62 could not be found. [ 1529.345121] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-b01d96ad-e621-46ce-b976-851d54e12df8 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1529.345121] env[62277]: INFO nova.compute.manager [None req-b01d96ad-e621-46ce-b976-851d54e12df8 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1529.345121] env[62277]: DEBUG oslo.service.loopingcall [None req-b01d96ad-e621-46ce-b976-851d54e12df8 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1529.345121] env[62277]: DEBUG nova.compute.manager [-] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1529.345121] env[62277]: DEBUG nova.network.neutron [-] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1529.352102] env[62277]: DEBUG nova.compute.manager [None req-2b0870bb-0544-48f2-bcd2-316ff7a5acd3 tempest-AttachInterfacesV270Test-44240375 tempest-AttachInterfacesV270Test-44240375-project-member] [instance: 0901d48b-bc88-461b-8503-eb6b51c39148] Instance disappeared before build. {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1529.375571] env[62277]: DEBUG oslo_concurrency.lockutils [None req-2b0870bb-0544-48f2-bcd2-316ff7a5acd3 tempest-AttachInterfacesV270Test-44240375 tempest-AttachInterfacesV270Test-44240375-project-member] Lock "0901d48b-bc88-461b-8503-eb6b51c39148" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 218.293s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1529.377030] env[62277]: DEBUG nova.network.neutron [-] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1529.386186] env[62277]: INFO nova.compute.manager [-] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] Took 0.04 seconds to deallocate network for instance. [ 1529.390653] env[62277]: DEBUG nova.compute.manager [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Starting instance... 
{{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1529.448984] env[62277]: DEBUG oslo_concurrency.lockutils [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1529.449241] env[62277]: DEBUG oslo_concurrency.lockutils [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1529.451842] env[62277]: INFO nova.compute.claims [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1529.558485] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b01d96ad-e621-46ce-b976-851d54e12df8 tempest-VolumesAssistedSnapshotsTest-658300929 tempest-VolumesAssistedSnapshotsTest-658300929-project-member] Lock "23bc5a48-3e96-4897-bf28-ad14a0bdde62" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.269s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1529.559407] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "23bc5a48-3e96-4897-bf28-ad14a0bdde62" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 281.421s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1529.559595] env[62277]: INFO nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 23bc5a48-3e96-4897-bf28-ad14a0bdde62] During sync_power_state the instance has a pending task (deleting). Skip. 
[ 1529.559769] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "23bc5a48-3e96-4897-bf28-ad14a0bdde62" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1529.944437] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7098a70a-176a-48e2-977b-a903c4cbf335 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1529.954245] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4bb2469b-0b0d-41c4-8c39-a43196ee0897 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1529.989746] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f0f0af26-263c-46c4-af6e-a1eacc9bf729 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1529.999178] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-04d0d787-5b24-4d5b-8b95-82eddaab20a3 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1530.013169] env[62277]: DEBUG nova.compute.provider_tree [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1530.022966] env[62277]: DEBUG nova.scheduler.client.report [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1530.040678] env[62277]: DEBUG oslo_concurrency.lockutils [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.591s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1530.041028] env[62277]: DEBUG nova.compute.manager [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Start building networks asynchronously for instance. 
{{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1530.083769] env[62277]: DEBUG nova.compute.utils [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Using /dev/sd instead of None {{(pid=62277) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1530.089106] env[62277]: DEBUG nova.compute.manager [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Allocating IP information in the background. {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1530.089336] env[62277]: DEBUG nova.network.neutron [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] allocate_for_instance() {{(pid=62277) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1530.095638] env[62277]: DEBUG nova.compute.manager [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Start building block device mappings for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1530.147626] env[62277]: DEBUG nova.policy [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b75599aca6d94b71a46ff1debe85f0cb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e130e1d3248844ff93c56c2cdc19f7af', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62277) authorize /opt/stack/nova/nova/policy.py:203}} [ 1530.171110] env[62277]: DEBUG nova.compute.manager [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Start spawning the instance on the hypervisor. 
{{(pid=62277) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1530.208506] env[62277]: DEBUG nova.virt.hardware [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-11-06T22:11:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-11-06T22:11:15Z,direct_url=,disk_format='vmdk',id=6f125163-af69-40e9-92ae-3b8a01d74b60,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='63ed6630e9c140baa826f53d7a0564d1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-11-06T22:11:16Z,virtual_size=,visibility=), allow threads: False {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1530.208782] env[62277]: DEBUG nova.virt.hardware [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Flavor limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1530.208984] env[62277]: DEBUG nova.virt.hardware [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Image limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1530.209090] env[62277]: DEBUG nova.virt.hardware [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Flavor pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1530.209246] env[62277]: DEBUG nova.virt.hardware [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Image pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1530.209394] env[62277]: DEBUG nova.virt.hardware [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1530.209598] env[62277]: DEBUG nova.virt.hardware [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1530.209751] env[62277]: DEBUG nova.virt.hardware [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1530.209910] env[62277]: DEBUG nova.virt.hardware [None 
req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Got 1 possible topologies {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1530.210085] env[62277]: DEBUG nova.virt.hardware [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1530.210255] env[62277]: DEBUG nova.virt.hardware [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1530.211458] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-86dbc8b0-bcd2-4db3-be18-d3300c8f94ba {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1530.219801] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b8723167-d364-4873-ac48-5f2430f6ecbf {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1530.547384] env[62277]: DEBUG nova.network.neutron [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Successfully created port: 83156fc8-8c4f-4fa5-8e9c-bfb5991cc94f {{(pid=62277) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1531.533911] env[62277]: DEBUG nova.compute.manager [req-aeb256ee-75e8-4b91-bf32-fa56b5f10615 req-2b943697-437d-42f7-b709-e14baf9b0687 service nova] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Received event network-vif-plugged-83156fc8-8c4f-4fa5-8e9c-bfb5991cc94f {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 1531.534198] env[62277]: DEBUG oslo_concurrency.lockutils [req-aeb256ee-75e8-4b91-bf32-fa56b5f10615 req-2b943697-437d-42f7-b709-e14baf9b0687 service nova] Acquiring lock "63267d5c-d004-41c1-866a-75b9e37521b7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1531.534358] env[62277]: DEBUG oslo_concurrency.lockutils [req-aeb256ee-75e8-4b91-bf32-fa56b5f10615 req-2b943697-437d-42f7-b709-e14baf9b0687 service nova] Lock "63267d5c-d004-41c1-866a-75b9e37521b7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1531.534522] env[62277]: DEBUG oslo_concurrency.lockutils [req-aeb256ee-75e8-4b91-bf32-fa56b5f10615 req-2b943697-437d-42f7-b709-e14baf9b0687 service nova] Lock "63267d5c-d004-41c1-866a-75b9e37521b7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1531.534685] env[62277]: DEBUG nova.compute.manager 
[req-aeb256ee-75e8-4b91-bf32-fa56b5f10615 req-2b943697-437d-42f7-b709-e14baf9b0687 service nova] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] No waiting events found dispatching network-vif-plugged-83156fc8-8c4f-4fa5-8e9c-bfb5991cc94f {{(pid=62277) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1531.534845] env[62277]: WARNING nova.compute.manager [req-aeb256ee-75e8-4b91-bf32-fa56b5f10615 req-2b943697-437d-42f7-b709-e14baf9b0687 service nova] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Received unexpected event network-vif-plugged-83156fc8-8c4f-4fa5-8e9c-bfb5991cc94f for instance with vm_state building and task_state spawning. [ 1531.555617] env[62277]: DEBUG nova.network.neutron [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Successfully updated port: 83156fc8-8c4f-4fa5-8e9c-bfb5991cc94f {{(pid=62277) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1531.570154] env[62277]: DEBUG oslo_concurrency.lockutils [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Acquiring lock "refresh_cache-63267d5c-d004-41c1-866a-75b9e37521b7" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1531.570306] env[62277]: DEBUG oslo_concurrency.lockutils [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Acquired lock "refresh_cache-63267d5c-d004-41c1-866a-75b9e37521b7" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1531.570455] env[62277]: DEBUG nova.network.neutron [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Building network info cache for instance {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1531.627087] env[62277]: DEBUG nova.network.neutron [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Instance cache missing network info. 
{{(pid=62277) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1531.876032] env[62277]: DEBUG nova.network.neutron [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Updating instance_info_cache with network_info: [{"id": "83156fc8-8c4f-4fa5-8e9c-bfb5991cc94f", "address": "fa:16:3e:cd:87:4d", "network": {"id": "c52bb975-a344-4564-b4c7-9e31ab1a2949", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1450631951-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e130e1d3248844ff93c56c2cdc19f7af", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c4810e0b-c5e1-43ca-8d35-de29f7ebe7b0", "external-id": "nsx-vlan-transportzone-60", "segmentation_id": 60, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap83156fc8-8c", "ovs_interfaceid": "83156fc8-8c4f-4fa5-8e9c-bfb5991cc94f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1531.892907] env[62277]: DEBUG oslo_concurrency.lockutils [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Releasing lock "refresh_cache-63267d5c-d004-41c1-866a-75b9e37521b7" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1531.893221] env[62277]: DEBUG nova.compute.manager [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Instance network_info: |[{"id": "83156fc8-8c4f-4fa5-8e9c-bfb5991cc94f", "address": "fa:16:3e:cd:87:4d", "network": {"id": "c52bb975-a344-4564-b4c7-9e31ab1a2949", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1450631951-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e130e1d3248844ff93c56c2cdc19f7af", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c4810e0b-c5e1-43ca-8d35-de29f7ebe7b0", "external-id": "nsx-vlan-transportzone-60", "segmentation_id": 60, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap83156fc8-8c", "ovs_interfaceid": "83156fc8-8c4f-4fa5-8e9c-bfb5991cc94f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62277) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 1531.895144] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:cd:87:4d', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'c4810e0b-c5e1-43ca-8d35-de29f7ebe7b0', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '83156fc8-8c4f-4fa5-8e9c-bfb5991cc94f', 'vif_model': 'vmxnet3'}] {{(pid=62277) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1531.904271] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Creating folder: Project (e130e1d3248844ff93c56c2cdc19f7af). Parent ref: group-v297781. {{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1531.904891] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d74f1405-ec7f-4a8d-b672-27f58e384c7c {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1531.920866] env[62277]: INFO nova.virt.vmwareapi.vm_util [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Created folder: Project (e130e1d3248844ff93c56c2cdc19f7af) in parent group-v297781. [ 1531.920959] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Creating folder: Instances. Parent ref: group-v297854. {{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1531.921497] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7f067860-25d5-46e8-9782-69e1a14497e0 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1531.930263] env[62277]: INFO nova.virt.vmwareapi.vm_util [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Created folder: Instances in parent group-v297854. [ 1531.930490] env[62277]: DEBUG oslo.service.loopingcall [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1531.930700] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Creating VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1531.930860] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-513a6182-22db-406f-96c0-63acf6a1329e {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1531.953800] env[62277]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1531.953800] env[62277]: value = "task-1405421" [ 1531.953800] env[62277]: _type = "Task" [ 1531.953800] env[62277]: } to complete. 
{{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1531.960259] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405421, 'name': CreateVM_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1532.101552] env[62277]: DEBUG oslo_concurrency.lockutils [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Acquiring lock "b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1532.102273] env[62277]: DEBUG oslo_concurrency.lockutils [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Lock "b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1532.168396] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1532.464356] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405421, 'name': CreateVM_Task} progress is 99%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1532.598612] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b859558a-63e9-4c03-88ff-71a69ecfbcb8 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Acquiring lock "63267d5c-d004-41c1-866a-75b9e37521b7" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1532.646624] env[62277]: DEBUG oslo_concurrency.lockutils [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Acquiring lock "35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1532.646721] env[62277]: DEBUG oslo_concurrency.lockutils [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Lock "35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1532.964356] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405421, 'name': CreateVM_Task} progress is 99%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1533.464404] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405421, 'name': CreateVM_Task, 'duration_secs': 1.293104} completed successfully. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1533.464623] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Created VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1533.465511] env[62277]: DEBUG oslo_concurrency.lockutils [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1533.465566] env[62277]: DEBUG oslo_concurrency.lockutils [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1533.465876] env[62277]: DEBUG oslo_concurrency.lockutils [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1533.466143] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-626964f7-37e1-41db-bd96-e6be1e81385f {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1533.470586] env[62277]: DEBUG oslo_vmware.api [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Waiting for the task: (returnval){ [ 1533.470586] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52c528ef-ede4-ebc6-5272-2ee9ce17eb09" [ 1533.470586] env[62277]: _type = "Task" [ 1533.470586] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1533.478064] env[62277]: DEBUG oslo_vmware.api [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52c528ef-ede4-ebc6-5272-2ee9ce17eb09, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1533.584962] env[62277]: DEBUG nova.compute.manager [req-f2662140-22f5-4c9d-b901-593f3954e9c7 req-7deddfbb-2d1c-49dd-a940-97f8c38a55a8 service nova] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Received event network-changed-83156fc8-8c4f-4fa5-8e9c-bfb5991cc94f {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 1533.585337] env[62277]: DEBUG nova.compute.manager [req-f2662140-22f5-4c9d-b901-593f3954e9c7 req-7deddfbb-2d1c-49dd-a940-97f8c38a55a8 service nova] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Refreshing instance network info cache due to event network-changed-83156fc8-8c4f-4fa5-8e9c-bfb5991cc94f. {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11104}} [ 1533.585657] env[62277]: DEBUG oslo_concurrency.lockutils [req-f2662140-22f5-4c9d-b901-593f3954e9c7 req-7deddfbb-2d1c-49dd-a940-97f8c38a55a8 service nova] Acquiring lock "refresh_cache-63267d5c-d004-41c1-866a-75b9e37521b7" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1533.585894] env[62277]: DEBUG oslo_concurrency.lockutils [req-f2662140-22f5-4c9d-b901-593f3954e9c7 req-7deddfbb-2d1c-49dd-a940-97f8c38a55a8 service nova] Acquired lock "refresh_cache-63267d5c-d004-41c1-866a-75b9e37521b7" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1533.586161] env[62277]: DEBUG nova.network.neutron [req-f2662140-22f5-4c9d-b901-593f3954e9c7 req-7deddfbb-2d1c-49dd-a940-97f8c38a55a8 service nova] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Refreshing network info cache for port 83156fc8-8c4f-4fa5-8e9c-bfb5991cc94f {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1533.876794] env[62277]: DEBUG nova.network.neutron [req-f2662140-22f5-4c9d-b901-593f3954e9c7 req-7deddfbb-2d1c-49dd-a940-97f8c38a55a8 service nova] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Updated VIF entry in instance network info cache for port 83156fc8-8c4f-4fa5-8e9c-bfb5991cc94f. 
{{(pid=62277) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1533.877150] env[62277]: DEBUG nova.network.neutron [req-f2662140-22f5-4c9d-b901-593f3954e9c7 req-7deddfbb-2d1c-49dd-a940-97f8c38a55a8 service nova] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Updating instance_info_cache with network_info: [{"id": "83156fc8-8c4f-4fa5-8e9c-bfb5991cc94f", "address": "fa:16:3e:cd:87:4d", "network": {"id": "c52bb975-a344-4564-b4c7-9e31ab1a2949", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1450631951-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e130e1d3248844ff93c56c2cdc19f7af", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c4810e0b-c5e1-43ca-8d35-de29f7ebe7b0", "external-id": "nsx-vlan-transportzone-60", "segmentation_id": 60, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap83156fc8-8c", "ovs_interfaceid": "83156fc8-8c4f-4fa5-8e9c-bfb5991cc94f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1533.886108] env[62277]: DEBUG oslo_concurrency.lockutils [req-f2662140-22f5-4c9d-b901-593f3954e9c7 req-7deddfbb-2d1c-49dd-a940-97f8c38a55a8 service nova] Releasing lock "refresh_cache-63267d5c-d004-41c1-866a-75b9e37521b7" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1533.981765] env[62277]: DEBUG oslo_concurrency.lockutils [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1533.982018] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Processing image 6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1533.982231] env[62277]: DEBUG oslo_concurrency.lockutils [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1573.443877] env[62277]: WARNING oslo_vmware.rw_handles [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1573.443877] env[62277]: ERROR 
oslo_vmware.rw_handles Traceback (most recent call last): [ 1573.443877] env[62277]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1573.443877] env[62277]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1573.443877] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1573.443877] env[62277]: ERROR oslo_vmware.rw_handles response.begin() [ 1573.443877] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1573.443877] env[62277]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1573.443877] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1573.443877] env[62277]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1573.443877] env[62277]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1573.443877] env[62277]: ERROR oslo_vmware.rw_handles [ 1573.444675] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Downloaded image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to vmware_temp/1544e1c7-d3ee-4635-9311-bb7a5cb52173/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1573.446334] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Caching image {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1573.446622] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Copying Virtual Disk [datastore2] vmware_temp/1544e1c7-d3ee-4635-9311-bb7a5cb52173/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk to [datastore2] vmware_temp/1544e1c7-d3ee-4635-9311-bb7a5cb52173/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk {{(pid=62277) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1573.446926] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-a71194a7-d30e-4257-9b76-66a381662f07 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1573.454673] env[62277]: DEBUG oslo_vmware.api [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Waiting for the task: (returnval){ [ 1573.454673] env[62277]: value = "task-1405422" [ 1573.454673] env[62277]: _type = "Task" [ 1573.454673] env[62277]: } to complete. 
{{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1573.462688] env[62277]: DEBUG oslo_vmware.api [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Task: {'id': task-1405422, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1573.965453] env[62277]: DEBUG oslo_vmware.exceptions [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Fault InvalidArgument not matched. {{(pid=62277) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1573.965782] env[62277]: DEBUG oslo_concurrency.lockutils [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1573.966343] env[62277]: ERROR nova.compute.manager [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1573.966343] env[62277]: Faults: ['InvalidArgument'] [ 1573.966343] env[62277]: ERROR nova.compute.manager [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Traceback (most recent call last): [ 1573.966343] env[62277]: ERROR nova.compute.manager [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1573.966343] env[62277]: ERROR nova.compute.manager [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] yield resources [ 1573.966343] env[62277]: ERROR nova.compute.manager [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1573.966343] env[62277]: ERROR nova.compute.manager [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] self.driver.spawn(context, instance, image_meta, [ 1573.966343] env[62277]: ERROR nova.compute.manager [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1573.966343] env[62277]: ERROR nova.compute.manager [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1573.966343] env[62277]: ERROR nova.compute.manager [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1573.966343] env[62277]: ERROR nova.compute.manager [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] self._fetch_image_if_missing(context, vi) [ 1573.966343] env[62277]: ERROR nova.compute.manager [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1573.966343] env[62277]: ERROR nova.compute.manager [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] image_cache(vi, tmp_image_ds_loc) [ 1573.966343] env[62277]: ERROR nova.compute.manager [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1573.966343] env[62277]: ERROR nova.compute.manager [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] vm_util.copy_virtual_disk( [ 1573.966343] env[62277]: ERROR nova.compute.manager [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1573.966343] env[62277]: ERROR nova.compute.manager [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] session._wait_for_task(vmdk_copy_task) [ 1573.966343] env[62277]: ERROR nova.compute.manager [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1573.966343] env[62277]: ERROR nova.compute.manager [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] return self.wait_for_task(task_ref) [ 1573.966343] env[62277]: ERROR nova.compute.manager [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1573.966343] env[62277]: ERROR nova.compute.manager [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] return evt.wait() [ 1573.966343] env[62277]: ERROR nova.compute.manager [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1573.966343] env[62277]: ERROR nova.compute.manager [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] result = hub.switch() [ 1573.966343] env[62277]: ERROR nova.compute.manager [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1573.966343] env[62277]: ERROR nova.compute.manager [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] return self.greenlet.switch() [ 1573.966343] env[62277]: ERROR nova.compute.manager [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1573.966343] env[62277]: ERROR nova.compute.manager [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] self.f(*self.args, **self.kw) [ 1573.966343] env[62277]: ERROR nova.compute.manager [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1573.966343] env[62277]: ERROR nova.compute.manager [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] raise exceptions.translate_fault(task_info.error) [ 1573.966343] env[62277]: ERROR nova.compute.manager [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1573.966343] env[62277]: ERROR nova.compute.manager [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Faults: ['InvalidArgument'] [ 1573.966343] env[62277]: ERROR nova.compute.manager [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] [ 1573.967555] env[62277]: INFO nova.compute.manager [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Terminating instance [ 1573.968526] env[62277]: DEBUG oslo_concurrency.lockutils [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Acquired lock "[datastore2] 
devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1573.968742] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1573.968978] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6e30868c-df2d-4895-aa77-1871140e50f1 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1573.972576] env[62277]: DEBUG nova.compute.manager [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Start destroying the instance on the hypervisor. {{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1573.972762] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1573.973517] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ed7d73a6-7f6e-4387-a79e-bff9d6d92e9a {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1573.977395] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1573.977628] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=62277) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1573.978643] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2c264b19-1ad1-45c7-a71f-0b96823f21a2 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1573.982835] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Unregistering the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1573.983323] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-d6be8e9a-34b3-4b92-aad3-3c3b5c6cd7c4 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1573.985959] env[62277]: DEBUG oslo_vmware.api [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Waiting for the task: (returnval){ [ 1573.985959] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]522dc4e4-c926-a0ea-a3a8-1b375bc90a74" [ 1573.985959] env[62277]: _type = "Task" [ 1573.985959] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1573.993891] env[62277]: DEBUG oslo_vmware.api [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]522dc4e4-c926-a0ea-a3a8-1b375bc90a74, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1574.057314] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Unregistered the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1574.057521] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Deleting contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1574.057886] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Deleting the datastore file [datastore2] 346748bd-b4e8-4e93-b71d-66c90a45e372 {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1574.058206] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-fc5afee1-b855-4c55-b551-8ef653e5bde4 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1574.065551] env[62277]: DEBUG oslo_vmware.api [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Waiting for the task: (returnval){ [ 1574.065551] env[62277]: value = "task-1405424" [ 1574.065551] env[62277]: _type = "Task" [ 1574.065551] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1574.075074] env[62277]: DEBUG oslo_vmware.api [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Task: {'id': task-1405424, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1574.496341] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Preparing fetch location {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1574.496675] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Creating directory with path [datastore2] vmware_temp/8d31ac27-aae5-4248-9d52-2a6330986f2b/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1574.496810] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1cb02288-a0bd-470a-afae-910fd3120abd {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1574.507771] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Created directory with path [datastore2] vmware_temp/8d31ac27-aae5-4248-9d52-2a6330986f2b/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1574.507956] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Fetch image to [datastore2] vmware_temp/8d31ac27-aae5-4248-9d52-2a6330986f2b/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1574.508137] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to [datastore2] vmware_temp/8d31ac27-aae5-4248-9d52-2a6330986f2b/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1574.508833] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93e028d2-e5b5-489c-bdab-e89145ac66af {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1574.515076] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e3502e2-de43-410e-bb5a-d0da8222ee1c {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1574.523834] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cc59a5b9-f604-453a-abda-602e98c3ebd5 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1574.553433] env[62277]: DEBUG oslo_vmware.service 
[-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e8ff4593-b3e6-46ed-96ad-6ddc51424eb5 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1574.559383] env[62277]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-b69bb716-c2d7-48ca-8113-a637ac506892 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1574.575042] env[62277]: DEBUG oslo_vmware.api [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Task: {'id': task-1405424, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.078434} completed successfully. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1574.575042] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Deleted the datastore file {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1574.575042] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Deleted contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1574.575042] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1574.575042] env[62277]: INFO nova.compute.manager [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1574.577061] env[62277]: DEBUG nova.compute.claims [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Aborting claim: {{(pid=62277) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1574.577234] env[62277]: DEBUG oslo_concurrency.lockutils [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1574.577453] env[62277]: DEBUG oslo_concurrency.lockutils [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1574.583010] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1574.641175] env[62277]: DEBUG oslo_vmware.rw_handles [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/8d31ac27-aae5-4248-9d52-2a6330986f2b/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62277) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1574.700608] env[62277]: DEBUG oslo_vmware.rw_handles [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Completed reading data from the image iterator. {{(pid=62277) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1574.700806] env[62277]: DEBUG oslo_vmware.rw_handles [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/8d31ac27-aae5-4248-9d52-2a6330986f2b/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=62277) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1574.985521] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e457edd-339c-48b5-900b-f3ed1320e524 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1574.992824] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e28b241d-adbf-47be-bc68-684cc803624f {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1575.023381] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-079ec508-b307-4012-a194-40a82369cfa9 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1575.030640] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d6dc8e21-0628-43e2-b513-095b7316a201 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1575.043674] env[62277]: DEBUG nova.compute.provider_tree [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1575.053402] env[62277]: DEBUG nova.scheduler.client.report [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1575.069624] env[62277]: DEBUG oslo_concurrency.lockutils [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.492s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1575.070158] env[62277]: ERROR nova.compute.manager [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1575.070158] env[62277]: Faults: ['InvalidArgument'] [ 1575.070158] env[62277]: ERROR nova.compute.manager [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Traceback (most recent call last): [ 1575.070158] env[62277]: ERROR nova.compute.manager [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1575.070158] env[62277]: ERROR nova.compute.manager 
[instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] self.driver.spawn(context, instance, image_meta, [ 1575.070158] env[62277]: ERROR nova.compute.manager [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1575.070158] env[62277]: ERROR nova.compute.manager [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1575.070158] env[62277]: ERROR nova.compute.manager [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1575.070158] env[62277]: ERROR nova.compute.manager [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] self._fetch_image_if_missing(context, vi) [ 1575.070158] env[62277]: ERROR nova.compute.manager [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1575.070158] env[62277]: ERROR nova.compute.manager [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] image_cache(vi, tmp_image_ds_loc) [ 1575.070158] env[62277]: ERROR nova.compute.manager [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1575.070158] env[62277]: ERROR nova.compute.manager [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] vm_util.copy_virtual_disk( [ 1575.070158] env[62277]: ERROR nova.compute.manager [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1575.070158] env[62277]: ERROR nova.compute.manager [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] session._wait_for_task(vmdk_copy_task) [ 1575.070158] env[62277]: ERROR nova.compute.manager [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1575.070158] env[62277]: ERROR nova.compute.manager [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] return self.wait_for_task(task_ref) [ 1575.070158] env[62277]: ERROR nova.compute.manager [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1575.070158] env[62277]: ERROR nova.compute.manager [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] return evt.wait() [ 1575.070158] env[62277]: ERROR nova.compute.manager [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1575.070158] env[62277]: ERROR nova.compute.manager [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] result = hub.switch() [ 1575.070158] env[62277]: ERROR nova.compute.manager [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1575.070158] env[62277]: ERROR nova.compute.manager [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] return self.greenlet.switch() [ 1575.070158] env[62277]: ERROR nova.compute.manager [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1575.070158] env[62277]: ERROR nova.compute.manager [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] self.f(*self.args, **self.kw) [ 1575.070158] env[62277]: ERROR nova.compute.manager [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1575.070158] env[62277]: ERROR nova.compute.manager [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] raise exceptions.translate_fault(task_info.error) [ 1575.070158] env[62277]: ERROR nova.compute.manager [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1575.070158] env[62277]: ERROR nova.compute.manager [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Faults: ['InvalidArgument'] [ 1575.070158] env[62277]: ERROR nova.compute.manager [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] [ 1575.071260] env[62277]: DEBUG nova.compute.utils [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] VimFaultException {{(pid=62277) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1575.072545] env[62277]: DEBUG nova.compute.manager [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Build of instance 346748bd-b4e8-4e93-b71d-66c90a45e372 was re-scheduled: A specified parameter was not correct: fileType [ 1575.072545] env[62277]: Faults: ['InvalidArgument'] {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1575.072919] env[62277]: DEBUG nova.compute.manager [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Unplugging VIFs for instance {{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1575.073104] env[62277]: DEBUG nova.compute.manager [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1575.073258] env[62277]: DEBUG nova.compute.manager [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1575.073417] env[62277]: DEBUG nova.network.neutron [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1575.421777] env[62277]: DEBUG nova.network.neutron [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1575.437172] env[62277]: INFO nova.compute.manager [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Took 0.36 seconds to deallocate network for instance. [ 1575.553145] env[62277]: INFO nova.scheduler.client.report [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Deleted allocations for instance 346748bd-b4e8-4e93-b71d-66c90a45e372 [ 1575.577377] env[62277]: DEBUG oslo_concurrency.lockutils [None req-8227c2f9-1e2d-47d1-bc26-95044d8ffe32 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Lock "346748bd-b4e8-4e93-b71d-66c90a45e372" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 560.088s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1575.579045] env[62277]: DEBUG oslo_concurrency.lockutils [None req-22722f1a-8fe1-4435-bba7-68377be5b793 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Lock "346748bd-b4e8-4e93-b71d-66c90a45e372" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 361.900s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1575.579045] env[62277]: DEBUG oslo_concurrency.lockutils [None req-22722f1a-8fe1-4435-bba7-68377be5b793 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Acquiring lock "346748bd-b4e8-4e93-b71d-66c90a45e372-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1575.579045] env[62277]: DEBUG oslo_concurrency.lockutils [None req-22722f1a-8fe1-4435-bba7-68377be5b793 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Lock "346748bd-b4e8-4e93-b71d-66c90a45e372-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=62277) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1575.579276] env[62277]: DEBUG oslo_concurrency.lockutils [None req-22722f1a-8fe1-4435-bba7-68377be5b793 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Lock "346748bd-b4e8-4e93-b71d-66c90a45e372-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1575.581042] env[62277]: INFO nova.compute.manager [None req-22722f1a-8fe1-4435-bba7-68377be5b793 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Terminating instance [ 1575.582717] env[62277]: DEBUG nova.compute.manager [None req-22722f1a-8fe1-4435-bba7-68377be5b793 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Start destroying the instance on the hypervisor. {{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1575.582980] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-22722f1a-8fe1-4435-bba7-68377be5b793 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1575.583382] env[62277]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-462fa7e8-cf04-42eb-bc96-040218151fad {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1575.592485] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4056c0ad-4e16-4008-8c18-3b6a8dd42b5a {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1575.604171] env[62277]: DEBUG nova.compute.manager [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] [instance: 13959890-87a1-45ba-98de-621373e265e7] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1575.621618] env[62277]: WARNING nova.virt.vmwareapi.vmops [None req-22722f1a-8fe1-4435-bba7-68377be5b793 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 346748bd-b4e8-4e93-b71d-66c90a45e372 could not be found. [ 1575.621851] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-22722f1a-8fe1-4435-bba7-68377be5b793 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1575.622055] env[62277]: INFO nova.compute.manager [None req-22722f1a-8fe1-4435-bba7-68377be5b793 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 1575.622252] env[62277]: DEBUG oslo.service.loopingcall [None req-22722f1a-8fe1-4435-bba7-68377be5b793 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1575.622479] env[62277]: DEBUG nova.compute.manager [-] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1575.622584] env[62277]: DEBUG nova.network.neutron [-] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1575.648694] env[62277]: DEBUG nova.network.neutron [-] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1575.651537] env[62277]: DEBUG oslo_concurrency.lockutils [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1575.651760] env[62277]: DEBUG oslo_concurrency.lockutils [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1575.653480] env[62277]: INFO nova.compute.claims [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] [instance: 13959890-87a1-45ba-98de-621373e265e7] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1575.657025] env[62277]: INFO nova.compute.manager [-] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] Took 0.03 seconds to deallocate network for instance. 
[ 1575.760291] env[62277]: DEBUG oslo_concurrency.lockutils [None req-22722f1a-8fe1-4435-bba7-68377be5b793 tempest-TenantUsagesTestJSON-526754953 tempest-TenantUsagesTestJSON-526754953-project-member] Lock "346748bd-b4e8-4e93-b71d-66c90a45e372" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.182s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1575.761154] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "346748bd-b4e8-4e93-b71d-66c90a45e372" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 327.622s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1575.761285] env[62277]: INFO nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 346748bd-b4e8-4e93-b71d-66c90a45e372] During sync_power_state the instance has a pending task (deleting). Skip. [ 1575.761451] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "346748bd-b4e8-4e93-b71d-66c90a45e372" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1576.001122] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16bbf298-3d86-4514-aeca-b68eae2c8c39 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1576.008804] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d1674ec-053d-4233-a565-3fac6fe1576a {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1576.038184] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8477a806-ceeb-49ef-9fc0-e325dc103c30 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1576.047096] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c147a2db-d21d-417c-ba9d-286c99b4965f {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1576.058332] env[62277]: DEBUG nova.compute.provider_tree [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1576.067110] env[62277]: DEBUG nova.scheduler.client.report [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 
400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1576.083011] env[62277]: DEBUG oslo_concurrency.lockutils [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.431s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1576.083482] env[62277]: DEBUG nova.compute.manager [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] [instance: 13959890-87a1-45ba-98de-621373e265e7] Start building networks asynchronously for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1576.114507] env[62277]: DEBUG nova.compute.utils [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Using /dev/sd instead of None {{(pid=62277) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1576.116372] env[62277]: DEBUG nova.compute.manager [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] [instance: 13959890-87a1-45ba-98de-621373e265e7] Allocating IP information in the background. {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1576.116653] env[62277]: DEBUG nova.network.neutron [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] [instance: 13959890-87a1-45ba-98de-621373e265e7] allocate_for_instance() {{(pid=62277) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1576.124655] env[62277]: DEBUG nova.compute.manager [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] [instance: 13959890-87a1-45ba-98de-621373e265e7] Start building block device mappings for instance. 
{{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1576.191741] env[62277]: DEBUG nova.policy [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bae830f90e4f40ed80e195074895aa49', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '10008730f85447acb5c6ab439f696e27', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62277) authorize /opt/stack/nova/nova/policy.py:203}} [ 1576.194232] env[62277]: DEBUG nova.compute.manager [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] [instance: 13959890-87a1-45ba-98de-621373e265e7] Start spawning the instance on the hypervisor. {{(pid=62277) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1576.220486] env[62277]: DEBUG nova.virt.hardware [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-11-06T22:11:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-11-06T22:11:15Z,direct_url=,disk_format='vmdk',id=6f125163-af69-40e9-92ae-3b8a01d74b60,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='63ed6630e9c140baa826f53d7a0564d1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-11-06T22:11:16Z,virtual_size=,visibility=), allow threads: False {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1576.220738] env[62277]: DEBUG nova.virt.hardware [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Flavor limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1576.220873] env[62277]: DEBUG nova.virt.hardware [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Image limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1576.221061] env[62277]: DEBUG nova.virt.hardware [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Flavor pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1576.221204] env[62277]: DEBUG nova.virt.hardware [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Image pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:392}} [ 1576.221346] env[62277]: DEBUG nova.virt.hardware [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1576.221550] env[62277]: DEBUG nova.virt.hardware [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1576.221694] env[62277]: DEBUG nova.virt.hardware [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1576.221893] env[62277]: DEBUG nova.virt.hardware [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Got 1 possible topologies {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1576.222069] env[62277]: DEBUG nova.virt.hardware [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1576.222268] env[62277]: DEBUG nova.virt.hardware [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1576.223089] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67516c4e-b6ac-44ab-b863-a6da89ec2470 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1576.231428] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d4640695-7290-4136-8609-b03989991386 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1576.662365] env[62277]: DEBUG nova.network.neutron [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] [instance: 13959890-87a1-45ba-98de-621373e265e7] Successfully created port: 2ace4e56-fd15-472d-912f-14e6fe93dbb8 {{(pid=62277) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1577.541164] env[62277]: DEBUG nova.compute.manager [req-1ab0d5e6-01d4-41f2-b500-5256d1a7ce08 req-e6b53485-1d6a-4452-b7f0-ca08a749e50c service nova] [instance: 13959890-87a1-45ba-98de-621373e265e7] Received event network-vif-plugged-2ace4e56-fd15-472d-912f-14e6fe93dbb8 {{(pid=62277) external_instance_event 
/opt/stack/nova/nova/compute/manager.py:11099}} [ 1577.543066] env[62277]: DEBUG oslo_concurrency.lockutils [req-1ab0d5e6-01d4-41f2-b500-5256d1a7ce08 req-e6b53485-1d6a-4452-b7f0-ca08a749e50c service nova] Acquiring lock "13959890-87a1-45ba-98de-621373e265e7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1577.543066] env[62277]: DEBUG oslo_concurrency.lockutils [req-1ab0d5e6-01d4-41f2-b500-5256d1a7ce08 req-e6b53485-1d6a-4452-b7f0-ca08a749e50c service nova] Lock "13959890-87a1-45ba-98de-621373e265e7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1577.543066] env[62277]: DEBUG oslo_concurrency.lockutils [req-1ab0d5e6-01d4-41f2-b500-5256d1a7ce08 req-e6b53485-1d6a-4452-b7f0-ca08a749e50c service nova] Lock "13959890-87a1-45ba-98de-621373e265e7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1577.543066] env[62277]: DEBUG nova.compute.manager [req-1ab0d5e6-01d4-41f2-b500-5256d1a7ce08 req-e6b53485-1d6a-4452-b7f0-ca08a749e50c service nova] [instance: 13959890-87a1-45ba-98de-621373e265e7] No waiting events found dispatching network-vif-plugged-2ace4e56-fd15-472d-912f-14e6fe93dbb8 {{(pid=62277) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1577.543066] env[62277]: WARNING nova.compute.manager [req-1ab0d5e6-01d4-41f2-b500-5256d1a7ce08 req-e6b53485-1d6a-4452-b7f0-ca08a749e50c service nova] [instance: 13959890-87a1-45ba-98de-621373e265e7] Received unexpected event network-vif-plugged-2ace4e56-fd15-472d-912f-14e6fe93dbb8 for instance with vm_state building and task_state spawning. 
[ 1577.615708] env[62277]: DEBUG nova.network.neutron [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] [instance: 13959890-87a1-45ba-98de-621373e265e7] Successfully updated port: 2ace4e56-fd15-472d-912f-14e6fe93dbb8 {{(pid=62277) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1577.627406] env[62277]: DEBUG oslo_concurrency.lockutils [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Acquiring lock "refresh_cache-13959890-87a1-45ba-98de-621373e265e7" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1577.627406] env[62277]: DEBUG oslo_concurrency.lockutils [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Acquired lock "refresh_cache-13959890-87a1-45ba-98de-621373e265e7" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1577.627406] env[62277]: DEBUG nova.network.neutron [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] [instance: 13959890-87a1-45ba-98de-621373e265e7] Building network info cache for instance {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1577.694273] env[62277]: DEBUG nova.network.neutron [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] [instance: 13959890-87a1-45ba-98de-621373e265e7] Instance cache missing network info. 
{{(pid=62277) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1577.877740] env[62277]: DEBUG nova.network.neutron [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] [instance: 13959890-87a1-45ba-98de-621373e265e7] Updating instance_info_cache with network_info: [{"id": "2ace4e56-fd15-472d-912f-14e6fe93dbb8", "address": "fa:16:3e:9b:fd:50", "network": {"id": "2c64e00f-4995-4827-b685-5096b1d1d064", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.56", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "63ed6630e9c140baa826f53d7a0564d1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "77aa121f-8fb6-42f3-aaea-43addfe449b2", "external-id": "nsx-vlan-transportzone-288", "segmentation_id": 288, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2ace4e56-fd", "ovs_interfaceid": "2ace4e56-fd15-472d-912f-14e6fe93dbb8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1577.890276] env[62277]: DEBUG oslo_concurrency.lockutils [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Releasing lock "refresh_cache-13959890-87a1-45ba-98de-621373e265e7" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1577.890569] env[62277]: DEBUG nova.compute.manager [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] [instance: 13959890-87a1-45ba-98de-621373e265e7] Instance network_info: |[{"id": "2ace4e56-fd15-472d-912f-14e6fe93dbb8", "address": "fa:16:3e:9b:fd:50", "network": {"id": "2c64e00f-4995-4827-b685-5096b1d1d064", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.56", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "63ed6630e9c140baa826f53d7a0564d1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "77aa121f-8fb6-42f3-aaea-43addfe449b2", "external-id": "nsx-vlan-transportzone-288", "segmentation_id": 288, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2ace4e56-fd", "ovs_interfaceid": "2ace4e56-fd15-472d-912f-14e6fe93dbb8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1577.890975] 
env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] [instance: 13959890-87a1-45ba-98de-621373e265e7] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:9b:fd:50', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '77aa121f-8fb6-42f3-aaea-43addfe449b2', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '2ace4e56-fd15-472d-912f-14e6fe93dbb8', 'vif_model': 'vmxnet3'}] {{(pid=62277) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1577.898719] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Creating folder: Project (10008730f85447acb5c6ab439f696e27). Parent ref: group-v297781. {{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1577.899228] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-ace90869-8f16-4beb-9504-d99e899f0026 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1577.909752] env[62277]: INFO nova.virt.vmwareapi.vm_util [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Created folder: Project (10008730f85447acb5c6ab439f696e27) in parent group-v297781. [ 1577.909931] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Creating folder: Instances. Parent ref: group-v297857. {{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1577.910164] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-8be9248f-a4ca-4eb8-9807-7504ed6da7a9 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1577.920386] env[62277]: INFO nova.virt.vmwareapi.vm_util [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Created folder: Instances in parent group-v297857. [ 1577.920537] env[62277]: DEBUG oslo.service.loopingcall [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1577.920710] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 13959890-87a1-45ba-98de-621373e265e7] Creating VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1577.920902] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-513f55d2-24ac-4d2f-bac5-e1d0fbcc9d33 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1577.939346] env[62277]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1577.939346] env[62277]: value = "task-1405427" [ 1577.939346] env[62277]: _type = "Task" [ 1577.939346] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1577.947476] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405427, 'name': CreateVM_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1578.449446] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405427, 'name': CreateVM_Task, 'duration_secs': 0.273641} completed successfully. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1578.449564] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 13959890-87a1-45ba-98de-621373e265e7] Created VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1578.450232] env[62277]: DEBUG oslo_concurrency.lockutils [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1578.450437] env[62277]: DEBUG oslo_concurrency.lockutils [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1578.450700] env[62277]: DEBUG oslo_concurrency.lockutils [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1578.450936] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-5778f711-19a7-4140-bb09-dc5e016fc41b {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1578.455659] env[62277]: DEBUG oslo_vmware.api [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Waiting for the task: (returnval){ [ 1578.455659] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]526bc8a1-6d50-0efa-f57f-fc23e3dc8fab" [ 1578.455659] env[62277]: _type = "Task" [ 1578.455659] 
env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1578.462443] env[62277]: DEBUG oslo_vmware.api [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]526bc8a1-6d50-0efa-f57f-fc23e3dc8fab, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1578.965780] env[62277]: DEBUG oslo_concurrency.lockutils [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1578.966127] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] [instance: 13959890-87a1-45ba-98de-621373e265e7] Processing image 6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1578.966242] env[62277]: DEBUG oslo_concurrency.lockutils [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1579.567124] env[62277]: DEBUG nova.compute.manager [req-ba0b7aa3-2f5f-423f-989d-b6ba20e65fe9 req-03bda5ae-39bf-4126-ad91-88288dd1e52f service nova] [instance: 13959890-87a1-45ba-98de-621373e265e7] Received event network-changed-2ace4e56-fd15-472d-912f-14e6fe93dbb8 {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 1579.567336] env[62277]: DEBUG nova.compute.manager [req-ba0b7aa3-2f5f-423f-989d-b6ba20e65fe9 req-03bda5ae-39bf-4126-ad91-88288dd1e52f service nova] [instance: 13959890-87a1-45ba-98de-621373e265e7] Refreshing instance network info cache due to event network-changed-2ace4e56-fd15-472d-912f-14e6fe93dbb8. 
{{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11104}} [ 1579.567548] env[62277]: DEBUG oslo_concurrency.lockutils [req-ba0b7aa3-2f5f-423f-989d-b6ba20e65fe9 req-03bda5ae-39bf-4126-ad91-88288dd1e52f service nova] Acquiring lock "refresh_cache-13959890-87a1-45ba-98de-621373e265e7" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1579.567713] env[62277]: DEBUG oslo_concurrency.lockutils [req-ba0b7aa3-2f5f-423f-989d-b6ba20e65fe9 req-03bda5ae-39bf-4126-ad91-88288dd1e52f service nova] Acquired lock "refresh_cache-13959890-87a1-45ba-98de-621373e265e7" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1579.567954] env[62277]: DEBUG nova.network.neutron [req-ba0b7aa3-2f5f-423f-989d-b6ba20e65fe9 req-03bda5ae-39bf-4126-ad91-88288dd1e52f service nova] [instance: 13959890-87a1-45ba-98de-621373e265e7] Refreshing network info cache for port 2ace4e56-fd15-472d-912f-14e6fe93dbb8 {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1579.853461] env[62277]: DEBUG nova.network.neutron [req-ba0b7aa3-2f5f-423f-989d-b6ba20e65fe9 req-03bda5ae-39bf-4126-ad91-88288dd1e52f service nova] [instance: 13959890-87a1-45ba-98de-621373e265e7] Updated VIF entry in instance network info cache for port 2ace4e56-fd15-472d-912f-14e6fe93dbb8. {{(pid=62277) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1579.853934] env[62277]: DEBUG nova.network.neutron [req-ba0b7aa3-2f5f-423f-989d-b6ba20e65fe9 req-03bda5ae-39bf-4126-ad91-88288dd1e52f service nova] [instance: 13959890-87a1-45ba-98de-621373e265e7] Updating instance_info_cache with network_info: [{"id": "2ace4e56-fd15-472d-912f-14e6fe93dbb8", "address": "fa:16:3e:9b:fd:50", "network": {"id": "2c64e00f-4995-4827-b685-5096b1d1d064", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.56", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "63ed6630e9c140baa826f53d7a0564d1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "77aa121f-8fb6-42f3-aaea-43addfe449b2", "external-id": "nsx-vlan-transportzone-288", "segmentation_id": 288, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2ace4e56-fd", "ovs_interfaceid": "2ace4e56-fd15-472d-912f-14e6fe93dbb8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1579.863331] env[62277]: DEBUG oslo_concurrency.lockutils [req-ba0b7aa3-2f5f-423f-989d-b6ba20e65fe9 req-03bda5ae-39bf-4126-ad91-88288dd1e52f service nova] Releasing lock "refresh_cache-13959890-87a1-45ba-98de-621373e265e7" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1581.176767] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62277) run_periodic_tasks 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1582.169322] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1582.169511] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Starting heal instance info cache {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9909}} [ 1582.169633] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Rebuilding the list of instances to heal {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9913}} [ 1582.194247] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1582.194514] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1582.194661] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1582.194745] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1582.194878] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1582.195018] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1582.195142] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 8d00162c-7379-48b6-841b-f802db2582db] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1582.195261] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Skipping network cache update for instance because it is Building. 
{{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1582.195383] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1582.195494] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 13959890-87a1-45ba-98de-621373e265e7] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1582.195613] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Didn't find any instances for network info cache update. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9995}} [ 1583.168743] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1584.163369] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1584.650355] env[62277]: DEBUG oslo_concurrency.lockutils [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Acquiring lock "163eb4e7-33f8-4674-8a3f-5094356e250d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1584.650355] env[62277]: DEBUG oslo_concurrency.lockutils [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Lock "163eb4e7-33f8-4674-8a3f-5094356e250d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1585.163713] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1585.186118] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1585.186887] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1585.186887] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] 
Running periodic task ComputeManager.update_available_resource {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1585.196405] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1585.196748] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1585.196962] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1585.197166] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62277) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1585.198262] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e25862c4-d817-4f39-87fe-f994b22410a1 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1585.207047] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-241d8710-9b72-4707-af73-c683f8d36a70 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1585.220621] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ec3b556-93ba-45f3-9373-a03cf9f948fd {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1585.226586] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0dcf2f66-a284-4222-b33c-387cb38160e3 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1585.256072] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181448MB free_disk=184GB free_vcpus=48 pci_devices=None {{(pid=62277) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1585.256197] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1585.256358] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by 
"nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1585.325074] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 930ff058-ab48-4c8a-8f5e-4820a3b12d50 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1585.325287] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1585.325462] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 1e8429b2-7149-4832-8590-e0ebd8501176 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1585.325610] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 21bd4623-2b46-43c4-859f-c4d3bf261e1f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1585.325764] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 6d759045-e1fc-43ea-a882-1ead769b6d29 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1585.325891] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 32bed248-06d5-47a1-b281-47921d99dbf6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1585.326014] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 8d00162c-7379-48b6-841b-f802db2582db actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1585.326136] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 900160c8-a715-45a4-8709-b314fc3216d5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1585.326427] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 63267d5c-d004-41c1-866a-75b9e37521b7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1585.326427] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 13959890-87a1-45ba-98de-621373e265e7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1585.337499] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance b2ad5654-28f5-43b2-acb8-a7eb01f70b55 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1585.347629] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance ad6f7885-4dbd-4306-baf2-b75cc43276d3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1585.356947] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 7a91e375-b4c6-4e05-a631-dae60926de1a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1585.368268] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 26dc82d6-bc3c-4b53-8fff-64578be0d404 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1585.379106] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 7e8d3614-e021-4509-87c7-1c4d68c3e570 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1585.388939] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance c3af2c56-c745-43e4-9f1b-77937a1d2559 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1585.398130] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance b8467237-7a59-4302-9dd3-0cbdfc813753 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1585.407408] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance a7cc7e45-8567-4699-af83-624b1c7c5c64 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1585.416942] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 24ee6c71-7267-4fe2-8ac4-84bf1d00c024 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1585.427714] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance f468d0c6-35ed-4f8d-a3dc-aea9462aa7bb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1585.438086] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 8167915d-ed3a-44b7-8eff-d585e7f6ffbf has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1585.449403] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 9fedbb74-ae57-4cb8-8496-2ff9c703b46e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1585.460022] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 42005809-1926-44b2-8ef6-3b6cb28a4020 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1585.471047] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 1742e8d0-3cf2-4a78-99e4-652f9664df96 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1585.481300] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1585.491146] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1585.501624] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 163eb4e7-33f8-4674-8a3f-5094356e250d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1585.501854] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1585.502180] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1585.793713] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3072ba65-34ba-4ae4-b7fb-c11712475449 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1585.801348] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e0764f23-71b7-420c-bd2a-850fd5c7df1e {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1585.830880] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82b36369-4403-4a1a-889d-098e498e5f68 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1585.837853] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5e8a3f5d-5c9c-428e-ac8c-8c11c0f844bd {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1585.850570] env[62277]: DEBUG nova.compute.provider_tree [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1585.861465] env[62277]: DEBUG nova.scheduler.client.report [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1585.877564] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62277) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1585.877764] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.621s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1586.860220] env[62277]: DEBUG oslo_service.periodic_task [None 
req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1589.168444] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1589.168692] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=62277) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10528}} [ 1601.601292] env[62277]: DEBUG oslo_concurrency.lockutils [None req-56856b71-e730-44f7-b78d-e5970a8d3f03 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Acquiring lock "13959890-87a1-45ba-98de-621373e265e7" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1610.275126] env[62277]: DEBUG oslo_concurrency.lockutils [None req-625a5d10-4ad2-4624-8618-29ea970def61 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Acquiring lock "2fdebb33-a32b-4753-aa2e-adfc4b252fac" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1610.275399] env[62277]: DEBUG oslo_concurrency.lockutils [None req-625a5d10-4ad2-4624-8618-29ea970def61 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Lock "2fdebb33-a32b-4753-aa2e-adfc4b252fac" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1616.779930] env[62277]: DEBUG oslo_concurrency.lockutils [None req-78215bee-628c-43dc-8ea4-260fa53b23f2 tempest-VolumesAdminNegativeTest-1493025573 tempest-VolumesAdminNegativeTest-1493025573-project-member] Acquiring lock "5ed6905b-ddb5-4517-a8dc-ee8e00b53db0" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1616.779930] env[62277]: DEBUG oslo_concurrency.lockutils [None req-78215bee-628c-43dc-8ea4-260fa53b23f2 tempest-VolumesAdminNegativeTest-1493025573 tempest-VolumesAdminNegativeTest-1493025573-project-member] Lock "5ed6905b-ddb5-4517-a8dc-ee8e00b53db0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1618.763139] env[62277]: DEBUG oslo_concurrency.lockutils [None req-c98419e1-ce19-48ef-b537-74c6ede412ba tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Acquiring lock "9f8669ed-f65f-4472-9ef6-01953c48466b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1618.763523] env[62277]: DEBUG oslo_concurrency.lockutils [None req-c98419e1-ce19-48ef-b537-74c6ede412ba tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Lock "9f8669ed-f65f-4472-9ef6-01953c48466b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1621.070318] env[62277]: DEBUG oslo_concurrency.lockutils [None req-eab2f50d-f473-463c-9429-9f09b3de3993 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Acquiring lock "3b8f2c10-5dce-44d0-bb6b-939afc01e44b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1621.070590] env[62277]: DEBUG oslo_concurrency.lockutils [None req-eab2f50d-f473-463c-9429-9f09b3de3993 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Lock "3b8f2c10-5dce-44d0-bb6b-939afc01e44b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1623.338350] env[62277]: WARNING oslo_vmware.rw_handles [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1623.338350] env[62277]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1623.338350] env[62277]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1623.338350] env[62277]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1623.338350] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1623.338350] env[62277]: ERROR oslo_vmware.rw_handles response.begin() [ 1623.338350] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1623.338350] env[62277]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1623.338350] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1623.338350] env[62277]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1623.338350] env[62277]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1623.338350] env[62277]: ERROR oslo_vmware.rw_handles [ 1623.338999] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Downloaded image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to vmware_temp/8d31ac27-aae5-4248-9d52-2a6330986f2b/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) fetch_image 
/opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1623.341473] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Caching image {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1623.341473] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Copying Virtual Disk [datastore2] vmware_temp/8d31ac27-aae5-4248-9d52-2a6330986f2b/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk to [datastore2] vmware_temp/8d31ac27-aae5-4248-9d52-2a6330986f2b/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk {{(pid=62277) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1623.341473] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-7d80f818-43e9-4441-a71b-0a468b92f1f4 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1623.349557] env[62277]: DEBUG oslo_vmware.api [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Waiting for the task: (returnval){ [ 1623.349557] env[62277]: value = "task-1405428" [ 1623.349557] env[62277]: _type = "Task" [ 1623.349557] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1623.357129] env[62277]: DEBUG oslo_vmware.api [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Task: {'id': task-1405428, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1623.860595] env[62277]: DEBUG oslo_vmware.exceptions [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Fault InvalidArgument not matched. 
{{(pid=62277) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1623.860788] env[62277]: DEBUG oslo_concurrency.lockutils [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1623.861362] env[62277]: ERROR nova.compute.manager [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1623.861362] env[62277]: Faults: ['InvalidArgument'] [ 1623.861362] env[62277]: ERROR nova.compute.manager [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Traceback (most recent call last): [ 1623.861362] env[62277]: ERROR nova.compute.manager [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1623.861362] env[62277]: ERROR nova.compute.manager [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] yield resources [ 1623.861362] env[62277]: ERROR nova.compute.manager [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1623.861362] env[62277]: ERROR nova.compute.manager [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] self.driver.spawn(context, instance, image_meta, [ 1623.861362] env[62277]: ERROR nova.compute.manager [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1623.861362] env[62277]: ERROR nova.compute.manager [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1623.861362] env[62277]: ERROR nova.compute.manager [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1623.861362] env[62277]: ERROR nova.compute.manager [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] self._fetch_image_if_missing(context, vi) [ 1623.861362] env[62277]: ERROR nova.compute.manager [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1623.861362] env[62277]: ERROR nova.compute.manager [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] image_cache(vi, tmp_image_ds_loc) [ 1623.861362] env[62277]: ERROR nova.compute.manager [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1623.861362] env[62277]: ERROR nova.compute.manager [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] vm_util.copy_virtual_disk( [ 1623.861362] env[62277]: ERROR nova.compute.manager [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1623.861362] env[62277]: ERROR nova.compute.manager [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] session._wait_for_task(vmdk_copy_task) [ 1623.861362] env[62277]: ERROR nova.compute.manager [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] 
File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1623.861362] env[62277]: ERROR nova.compute.manager [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] return self.wait_for_task(task_ref) [ 1623.861362] env[62277]: ERROR nova.compute.manager [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1623.861362] env[62277]: ERROR nova.compute.manager [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] return evt.wait() [ 1623.861362] env[62277]: ERROR nova.compute.manager [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1623.861362] env[62277]: ERROR nova.compute.manager [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] result = hub.switch() [ 1623.861362] env[62277]: ERROR nova.compute.manager [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1623.861362] env[62277]: ERROR nova.compute.manager [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] return self.greenlet.switch() [ 1623.861362] env[62277]: ERROR nova.compute.manager [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1623.861362] env[62277]: ERROR nova.compute.manager [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] self.f(*self.args, **self.kw) [ 1623.861362] env[62277]: ERROR nova.compute.manager [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1623.861362] env[62277]: ERROR nova.compute.manager [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] raise exceptions.translate_fault(task_info.error) [ 1623.861362] env[62277]: ERROR nova.compute.manager [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1623.861362] env[62277]: ERROR nova.compute.manager [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Faults: ['InvalidArgument'] [ 1623.861362] env[62277]: ERROR nova.compute.manager [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] [ 1623.862645] env[62277]: INFO nova.compute.manager [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Terminating instance [ 1623.863270] env[62277]: DEBUG oslo_concurrency.lockutils [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1623.863472] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1623.864120] env[62277]: DEBUG nova.compute.manager [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 
tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Start destroying the instance on the hypervisor. {{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1623.864307] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1623.864539] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-0a672f6d-5824-4539-8529-5d885567a8cb {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1623.867076] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f7aabe8-068c-4d97-b043-cd7e1c9fb99b {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1623.873914] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Unregistering the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1623.874150] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-0331165b-84b0-47f0-8764-61c0e1702908 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1623.878935] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1623.878935] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62277) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1623.879568] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7204a6a0-fa2a-4b40-a153-cc20c984dc07 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1623.884138] env[62277]: DEBUG oslo_vmware.api [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Waiting for the task: (returnval){ [ 1623.884138] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52e0ab9b-66fa-9257-ab04-8d31a7b0d14c" [ 1623.884138] env[62277]: _type = "Task" [ 1623.884138] env[62277]: } to complete. 
{{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1623.898667] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Preparing fetch location {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1623.898911] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Creating directory with path [datastore2] vmware_temp/50ce76bd-a65b-4013-9b82-43de76187d52/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1623.899154] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f0b4348f-7a07-40ed-8e8c-c4f7dfa5d88f {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1623.931389] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Created directory with path [datastore2] vmware_temp/50ce76bd-a65b-4013-9b82-43de76187d52/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1623.931626] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Fetch image to [datastore2] vmware_temp/50ce76bd-a65b-4013-9b82-43de76187d52/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1623.931810] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to [datastore2] vmware_temp/50ce76bd-a65b-4013-9b82-43de76187d52/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1623.932619] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-45d606b2-d1a6-47f7-bfc4-d7b482a4af18 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1623.939883] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5f4f5569-70e6-4797-8c05-b97a51c26b88 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1623.950265] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4348e3cb-b280-47d0-ab6d-cc33cefd052b {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1624.700789] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-45b5a94a-4798-4622-a47a-28b6824b3854 {{(pid=62277) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1624.703692] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Unregistered the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1624.703898] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Deleting contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1624.704086] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Deleting the datastore file [datastore2] 930ff058-ab48-4c8a-8f5e-4820a3b12d50 {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1624.704386] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-e7701f61-b3bc-4486-8ccb-e16738146f15 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1624.709789] env[62277]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-ad32859e-8210-4277-afd4-97f4bd30b208 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1624.712378] env[62277]: DEBUG oslo_vmware.api [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Waiting for the task: (returnval){ [ 1624.712378] env[62277]: value = "task-1405430" [ 1624.712378] env[62277]: _type = "Task" [ 1624.712378] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1624.720165] env[62277]: DEBUG oslo_vmware.api [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Task: {'id': task-1405430, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1624.801052] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1624.979287] env[62277]: DEBUG oslo_vmware.rw_handles [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/50ce76bd-a65b-4013-9b82-43de76187d52/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62277) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1625.038578] env[62277]: DEBUG oslo_vmware.rw_handles [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Completed reading data from the image iterator. {{(pid=62277) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1625.038866] env[62277]: DEBUG oslo_vmware.rw_handles [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/50ce76bd-a65b-4013-9b82-43de76187d52/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62277) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1625.222763] env[62277]: DEBUG oslo_vmware.api [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Task: {'id': task-1405430, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.068442} completed successfully. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1625.223028] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Deleted the datastore file {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1625.223779] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Deleted contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1625.223779] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1625.223779] env[62277]: INFO nova.compute.manager [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Took 1.36 seconds to destroy the instance on the hypervisor. [ 1625.227490] env[62277]: DEBUG nova.compute.claims [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Aborting claim: {{(pid=62277) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1625.227552] env[62277]: DEBUG oslo_concurrency.lockutils [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1625.227747] env[62277]: DEBUG oslo_concurrency.lockutils [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1625.558729] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-54a4cea6-0262-4de1-80f1-0349c7eceddc {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1625.566152] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c889e68a-c50e-438b-9f3b-d29306e59126 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1625.595900] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd38771d-8109-448e-b13a-d5452e81d15f {{(pid=62277) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1625.603251] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-049385c7-0ddf-4fd1-9edd-4899e6013758 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1625.616262] env[62277]: DEBUG nova.compute.provider_tree [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1625.625206] env[62277]: DEBUG nova.scheduler.client.report [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1625.639690] env[62277]: DEBUG oslo_concurrency.lockutils [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.412s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1625.640248] env[62277]: ERROR nova.compute.manager [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1625.640248] env[62277]: Faults: ['InvalidArgument'] [ 1625.640248] env[62277]: ERROR nova.compute.manager [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Traceback (most recent call last): [ 1625.640248] env[62277]: ERROR nova.compute.manager [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1625.640248] env[62277]: ERROR nova.compute.manager [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] self.driver.spawn(context, instance, image_meta, [ 1625.640248] env[62277]: ERROR nova.compute.manager [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1625.640248] env[62277]: ERROR nova.compute.manager [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1625.640248] env[62277]: ERROR nova.compute.manager [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1625.640248] env[62277]: ERROR nova.compute.manager [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] 
self._fetch_image_if_missing(context, vi) [ 1625.640248] env[62277]: ERROR nova.compute.manager [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1625.640248] env[62277]: ERROR nova.compute.manager [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] image_cache(vi, tmp_image_ds_loc) [ 1625.640248] env[62277]: ERROR nova.compute.manager [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1625.640248] env[62277]: ERROR nova.compute.manager [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] vm_util.copy_virtual_disk( [ 1625.640248] env[62277]: ERROR nova.compute.manager [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1625.640248] env[62277]: ERROR nova.compute.manager [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] session._wait_for_task(vmdk_copy_task) [ 1625.640248] env[62277]: ERROR nova.compute.manager [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1625.640248] env[62277]: ERROR nova.compute.manager [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] return self.wait_for_task(task_ref) [ 1625.640248] env[62277]: ERROR nova.compute.manager [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1625.640248] env[62277]: ERROR nova.compute.manager [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] return evt.wait() [ 1625.640248] env[62277]: ERROR nova.compute.manager [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1625.640248] env[62277]: ERROR nova.compute.manager [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] result = hub.switch() [ 1625.640248] env[62277]: ERROR nova.compute.manager [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1625.640248] env[62277]: ERROR nova.compute.manager [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] return self.greenlet.switch() [ 1625.640248] env[62277]: ERROR nova.compute.manager [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1625.640248] env[62277]: ERROR nova.compute.manager [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] self.f(*self.args, **self.kw) [ 1625.640248] env[62277]: ERROR nova.compute.manager [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1625.640248] env[62277]: ERROR nova.compute.manager [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] raise exceptions.translate_fault(task_info.error) [ 1625.640248] env[62277]: ERROR nova.compute.manager [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1625.640248] env[62277]: ERROR nova.compute.manager [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Faults: ['InvalidArgument'] [ 1625.640248] env[62277]: ERROR nova.compute.manager [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] [ 1625.641410] env[62277]: DEBUG nova.compute.utils [None 
req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] VimFaultException {{(pid=62277) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1625.644037] env[62277]: DEBUG nova.compute.manager [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Build of instance 930ff058-ab48-4c8a-8f5e-4820a3b12d50 was re-scheduled: A specified parameter was not correct: fileType [ 1625.644037] env[62277]: Faults: ['InvalidArgument'] {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1625.644460] env[62277]: DEBUG nova.compute.manager [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Unplugging VIFs for instance {{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1625.644637] env[62277]: DEBUG nova.compute.manager [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1625.644804] env[62277]: DEBUG nova.compute.manager [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1625.644962] env[62277]: DEBUG nova.network.neutron [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1626.134629] env[62277]: DEBUG nova.network.neutron [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1626.156653] env[62277]: INFO nova.compute.manager [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Took 0.51 seconds to deallocate network for instance. 
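The traceback above ends inside oslo.vmware's task poller: the vCenter CopyVirtualDisk_Task reports an error, _poll_task translates it, and the caller sees "A specified parameter was not correct: fileType / Faults: ['InvalidArgument']", which is why the build is re-scheduled and the network deallocated. The following is only a simplified sketch of that poll-and-translate loop, not the actual oslo.vmware code; get_task_info, TaskFault and poll_interval are illustrative names.

    import time

    class TaskFault(Exception):
        """Stand-in for oslo_vmware.exceptions.VimFaultException in this sketch."""

    def wait_for_task(get_task_info, poll_interval=0.5):
        # get_task_info is a hypothetical callable returning an object with
        # .state ('running' | 'success' | 'error') and .error, loosely mirroring
        # the vCenter TaskInfo that the real poller inspects.
        while True:
            info = get_task_info()
            if info.state == 'success':
                return info
            if info.state == 'error':
                # The real code translates the VMODL fault into an exception,
                # e.g. "A specified parameter was not correct: fileType"
                # with Faults: ['InvalidArgument'], as seen in the log above.
                raise TaskFault(info.error)
            # Task still queued or running: wait and poll again.
            time.sleep(poll_interval)

In the log this loop runs inside a greenthread (the eventlet frames in the traceback), so the error surfaces to the spawning code path that started the disk copy.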
[ 1626.283179] env[62277]: INFO nova.scheduler.client.report [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Deleted allocations for instance 930ff058-ab48-4c8a-8f5e-4820a3b12d50 [ 1626.307633] env[62277]: DEBUG oslo_concurrency.lockutils [None req-02032dd3-2223-4863-a4ba-d10d1098f4b3 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Lock "930ff058-ab48-4c8a-8f5e-4820a3b12d50" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 604.878s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1626.308590] env[62277]: DEBUG oslo_concurrency.lockutils [None req-3b49b6dd-08af-401b-8be0-58a31ad8d1f7 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Lock "930ff058-ab48-4c8a-8f5e-4820a3b12d50" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 404.045s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1626.309569] env[62277]: DEBUG oslo_concurrency.lockutils [None req-3b49b6dd-08af-401b-8be0-58a31ad8d1f7 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Acquiring lock "930ff058-ab48-4c8a-8f5e-4820a3b12d50-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1626.309896] env[62277]: DEBUG oslo_concurrency.lockutils [None req-3b49b6dd-08af-401b-8be0-58a31ad8d1f7 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Lock "930ff058-ab48-4c8a-8f5e-4820a3b12d50-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1626.310037] env[62277]: DEBUG oslo_concurrency.lockutils [None req-3b49b6dd-08af-401b-8be0-58a31ad8d1f7 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Lock "930ff058-ab48-4c8a-8f5e-4820a3b12d50-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1626.313154] env[62277]: INFO nova.compute.manager [None req-3b49b6dd-08af-401b-8be0-58a31ad8d1f7 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Terminating instance [ 1626.315133] env[62277]: DEBUG nova.compute.manager [None req-3b49b6dd-08af-401b-8be0-58a31ad8d1f7 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Start destroying the instance on the hypervisor. 
{{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1626.315133] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-3b49b6dd-08af-401b-8be0-58a31ad8d1f7 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1626.315532] env[62277]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-439614c4-68cb-457a-ab83-7ff4d82f36d9 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1626.320994] env[62277]: DEBUG nova.compute.manager [None req-4d5d312e-2a71-43df-bdb5-922651d130c8 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: b2ad5654-28f5-43b2-acb8-a7eb01f70b55] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1626.328104] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aa750f1b-452f-4934-991d-7971b07d1142 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1626.346307] env[62277]: DEBUG nova.compute.manager [None req-4d5d312e-2a71-43df-bdb5-922651d130c8 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: b2ad5654-28f5-43b2-acb8-a7eb01f70b55] Instance disappeared before build. {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1626.361319] env[62277]: WARNING nova.virt.vmwareapi.vmops [None req-3b49b6dd-08af-401b-8be0-58a31ad8d1f7 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 930ff058-ab48-4c8a-8f5e-4820a3b12d50 could not be found. [ 1626.361319] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-3b49b6dd-08af-401b-8be0-58a31ad8d1f7 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1626.361319] env[62277]: INFO nova.compute.manager [None req-3b49b6dd-08af-401b-8be0-58a31ad8d1f7 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1626.361319] env[62277]: DEBUG oslo.service.loopingcall [None req-3b49b6dd-08af-401b-8be0-58a31ad8d1f7 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1626.361319] env[62277]: DEBUG nova.compute.manager [-] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1626.361319] env[62277]: DEBUG nova.network.neutron [-] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1626.384481] env[62277]: DEBUG oslo_concurrency.lockutils [None req-4d5d312e-2a71-43df-bdb5-922651d130c8 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Lock "b2ad5654-28f5-43b2-acb8-a7eb01f70b55" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 214.363s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1626.402705] env[62277]: DEBUG nova.compute.manager [None req-06fd2190-8827-44cb-9e1d-adc9bd43f0c5 tempest-VolumesAdminNegativeTest-1493025573 tempest-VolumesAdminNegativeTest-1493025573-project-member] [instance: ad6f7885-4dbd-4306-baf2-b75cc43276d3] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1626.403176] env[62277]: DEBUG nova.network.neutron [-] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1626.414472] env[62277]: INFO nova.compute.manager [-] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] Took 0.05 seconds to deallocate network for instance. [ 1626.447583] env[62277]: DEBUG nova.compute.manager [None req-06fd2190-8827-44cb-9e1d-adc9bd43f0c5 tempest-VolumesAdminNegativeTest-1493025573 tempest-VolumesAdminNegativeTest-1493025573-project-member] [instance: ad6f7885-4dbd-4306-baf2-b75cc43276d3] Instance disappeared before build. {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1626.472123] env[62277]: DEBUG oslo_concurrency.lockutils [None req-06fd2190-8827-44cb-9e1d-adc9bd43f0c5 tempest-VolumesAdminNegativeTest-1493025573 tempest-VolumesAdminNegativeTest-1493025573-project-member] Lock "ad6f7885-4dbd-4306-baf2-b75cc43276d3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 207.033s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1626.495962] env[62277]: DEBUG nova.compute.manager [None req-658d427f-3d3b-4040-a5fe-541cd7995905 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] [instance: 7a91e375-b4c6-4e05-a631-dae60926de1a] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1626.526544] env[62277]: DEBUG nova.compute.manager [None req-658d427f-3d3b-4040-a5fe-541cd7995905 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] [instance: 7a91e375-b4c6-4e05-a631-dae60926de1a] Instance disappeared before build. 
{{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1626.538542] env[62277]: DEBUG oslo_concurrency.lockutils [None req-3b49b6dd-08af-401b-8be0-58a31ad8d1f7 tempest-ServersWithSpecificFlavorTestJSON-1064486923 tempest-ServersWithSpecificFlavorTestJSON-1064486923-project-member] Lock "930ff058-ab48-4c8a-8f5e-4820a3b12d50" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.230s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1626.540179] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "930ff058-ab48-4c8a-8f5e-4820a3b12d50" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 378.401s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1626.540826] env[62277]: INFO nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 930ff058-ab48-4c8a-8f5e-4820a3b12d50] During sync_power_state the instance has a pending task (deleting). Skip. [ 1626.541089] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "930ff058-ab48-4c8a-8f5e-4820a3b12d50" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1626.559762] env[62277]: DEBUG oslo_concurrency.lockutils [None req-658d427f-3d3b-4040-a5fe-541cd7995905 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Lock "7a91e375-b4c6-4e05-a631-dae60926de1a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 203.855s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1626.570615] env[62277]: DEBUG nova.compute.manager [None req-bb84464e-5140-4875-9c67-d5873702ff38 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] [instance: 26dc82d6-bc3c-4b53-8fff-64578be0d404] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1626.596770] env[62277]: DEBUG nova.compute.manager [None req-bb84464e-5140-4875-9c67-d5873702ff38 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] [instance: 26dc82d6-bc3c-4b53-8fff-64578be0d404] Instance disappeared before build. {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1626.618622] env[62277]: DEBUG oslo_concurrency.lockutils [None req-bb84464e-5140-4875-9c67-d5873702ff38 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Lock "26dc82d6-bc3c-4b53-8fff-64578be0d404" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 203.043s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1626.632280] env[62277]: DEBUG nova.compute.manager [None req-1cccbcb6-e42b-448d-8c4b-38ff0573d14d tempest-ListServersNegativeTestJSON-1271685526 tempest-ListServersNegativeTestJSON-1271685526-project-member] [instance: 7e8d3614-e021-4509-87c7-1c4d68c3e570] Starting instance... 
{{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1626.655934] env[62277]: DEBUG nova.compute.manager [None req-1cccbcb6-e42b-448d-8c4b-38ff0573d14d tempest-ListServersNegativeTestJSON-1271685526 tempest-ListServersNegativeTestJSON-1271685526-project-member] [instance: 7e8d3614-e021-4509-87c7-1c4d68c3e570] Instance disappeared before build. {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1626.680138] env[62277]: DEBUG oslo_concurrency.lockutils [None req-1cccbcb6-e42b-448d-8c4b-38ff0573d14d tempest-ListServersNegativeTestJSON-1271685526 tempest-ListServersNegativeTestJSON-1271685526-project-member] Lock "7e8d3614-e021-4509-87c7-1c4d68c3e570" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 197.451s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1626.689446] env[62277]: DEBUG nova.compute.manager [None req-1cccbcb6-e42b-448d-8c4b-38ff0573d14d tempest-ListServersNegativeTestJSON-1271685526 tempest-ListServersNegativeTestJSON-1271685526-project-member] [instance: c3af2c56-c745-43e4-9f1b-77937a1d2559] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1626.713937] env[62277]: DEBUG nova.compute.manager [None req-1cccbcb6-e42b-448d-8c4b-38ff0573d14d tempest-ListServersNegativeTestJSON-1271685526 tempest-ListServersNegativeTestJSON-1271685526-project-member] [instance: c3af2c56-c745-43e4-9f1b-77937a1d2559] Instance disappeared before build. {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1626.737922] env[62277]: DEBUG oslo_concurrency.lockutils [None req-1cccbcb6-e42b-448d-8c4b-38ff0573d14d tempest-ListServersNegativeTestJSON-1271685526 tempest-ListServersNegativeTestJSON-1271685526-project-member] Lock "c3af2c56-c745-43e4-9f1b-77937a1d2559" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 197.477s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1626.746992] env[62277]: DEBUG nova.compute.manager [None req-1cccbcb6-e42b-448d-8c4b-38ff0573d14d tempest-ListServersNegativeTestJSON-1271685526 tempest-ListServersNegativeTestJSON-1271685526-project-member] [instance: b8467237-7a59-4302-9dd3-0cbdfc813753] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1626.776479] env[62277]: DEBUG nova.compute.manager [None req-1cccbcb6-e42b-448d-8c4b-38ff0573d14d tempest-ListServersNegativeTestJSON-1271685526 tempest-ListServersNegativeTestJSON-1271685526-project-member] [instance: b8467237-7a59-4302-9dd3-0cbdfc813753] Instance disappeared before build. 
{{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1626.799042] env[62277]: DEBUG oslo_concurrency.lockutils [None req-1cccbcb6-e42b-448d-8c4b-38ff0573d14d tempest-ListServersNegativeTestJSON-1271685526 tempest-ListServersNegativeTestJSON-1271685526-project-member] Lock "b8467237-7a59-4302-9dd3-0cbdfc813753" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 197.501s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1626.807550] env[62277]: DEBUG nova.compute.manager [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1626.866680] env[62277]: DEBUG oslo_concurrency.lockutils [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1626.866953] env[62277]: DEBUG oslo_concurrency.lockutils [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1626.868482] env[62277]: INFO nova.compute.claims [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1627.210563] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f224a3ec-fc9f-46c1-accf-86b9836cc5fc {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1627.219441] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7cdedb7a-8e51-4ed1-9b32-45d6cf3bdbe8 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1627.251186] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-31d2c488-3b5d-4cd0-b0e1-78143e5bed2d {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1627.258976] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62cf5f07-af71-4a9e-af01-36674edc9945 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1627.271987] env[62277]: DEBUG nova.compute.provider_tree [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Inventory has not changed in ProviderTree for provider: 
75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1627.281396] env[62277]: DEBUG nova.scheduler.client.report [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1627.298450] env[62277]: DEBUG oslo_concurrency.lockutils [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.430s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1627.298909] env[62277]: DEBUG nova.compute.manager [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Start building networks asynchronously for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1627.333889] env[62277]: DEBUG nova.compute.utils [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Using /dev/sd instead of None {{(pid=62277) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1627.335356] env[62277]: DEBUG nova.compute.manager [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Allocating IP information in the background. {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1627.335517] env[62277]: DEBUG nova.network.neutron [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] allocate_for_instance() {{(pid=62277) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1627.349182] env[62277]: DEBUG nova.compute.manager [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Start building block device mappings for instance. 
{{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1627.396917] env[62277]: DEBUG nova.policy [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '34a955383f4e41209e89bdb051d2940d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9ac6ecb637fb4c9d8d6542633709cd80', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62277) authorize /opt/stack/nova/nova/policy.py:203}} [ 1627.417758] env[62277]: DEBUG nova.compute.manager [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Start spawning the instance on the hypervisor. {{(pid=62277) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1627.449490] env[62277]: DEBUG nova.virt.hardware [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-11-06T22:11:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-11-06T22:11:15Z,direct_url=,disk_format='vmdk',id=6f125163-af69-40e9-92ae-3b8a01d74b60,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='63ed6630e9c140baa826f53d7a0564d1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-11-06T22:11:16Z,virtual_size=,visibility=), allow threads: False {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1627.449490] env[62277]: DEBUG nova.virt.hardware [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Flavor limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1627.449490] env[62277]: DEBUG nova.virt.hardware [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Image limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1627.449490] env[62277]: DEBUG nova.virt.hardware [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Flavor pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1627.449490] env[62277]: DEBUG nova.virt.hardware [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Image pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:392}} [ 1627.449490] env[62277]: DEBUG nova.virt.hardware [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1627.450853] env[62277]: DEBUG nova.virt.hardware [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1627.450853] env[62277]: DEBUG nova.virt.hardware [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1627.450853] env[62277]: DEBUG nova.virt.hardware [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Got 1 possible topologies {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1627.450853] env[62277]: DEBUG nova.virt.hardware [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1627.450853] env[62277]: DEBUG nova.virt.hardware [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1627.451544] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b25a67e0-939b-4898-b917-22878c3be29f {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1627.460283] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b85f10d2-445b-4854-9aa7-76af9b6f801b {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1627.801087] env[62277]: DEBUG nova.network.neutron [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Successfully created port: 7d0db8c9-4016-49d1-86a8-87405d0654e0 {{(pid=62277) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1628.743701] env[62277]: DEBUG nova.compute.manager [req-235f0422-43b2-4e36-bb02-0f8fe190e184 req-18888461-e8aa-44a1-bfeb-cf765ea274d3 service nova] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Received event network-vif-plugged-7d0db8c9-4016-49d1-86a8-87405d0654e0 {{(pid=62277) external_instance_event 
/opt/stack/nova/nova/compute/manager.py:11099}} [ 1628.743983] env[62277]: DEBUG oslo_concurrency.lockutils [req-235f0422-43b2-4e36-bb02-0f8fe190e184 req-18888461-e8aa-44a1-bfeb-cf765ea274d3 service nova] Acquiring lock "a7cc7e45-8567-4699-af83-624b1c7c5c64-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1628.744110] env[62277]: DEBUG oslo_concurrency.lockutils [req-235f0422-43b2-4e36-bb02-0f8fe190e184 req-18888461-e8aa-44a1-bfeb-cf765ea274d3 service nova] Lock "a7cc7e45-8567-4699-af83-624b1c7c5c64-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1628.744280] env[62277]: DEBUG oslo_concurrency.lockutils [req-235f0422-43b2-4e36-bb02-0f8fe190e184 req-18888461-e8aa-44a1-bfeb-cf765ea274d3 service nova] Lock "a7cc7e45-8567-4699-af83-624b1c7c5c64-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1628.744441] env[62277]: DEBUG nova.compute.manager [req-235f0422-43b2-4e36-bb02-0f8fe190e184 req-18888461-e8aa-44a1-bfeb-cf765ea274d3 service nova] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] No waiting events found dispatching network-vif-plugged-7d0db8c9-4016-49d1-86a8-87405d0654e0 {{(pid=62277) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1628.744666] env[62277]: WARNING nova.compute.manager [req-235f0422-43b2-4e36-bb02-0f8fe190e184 req-18888461-e8aa-44a1-bfeb-cf765ea274d3 service nova] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Received unexpected event network-vif-plugged-7d0db8c9-4016-49d1-86a8-87405d0654e0 for instance with vm_state building and task_state spawning. 
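The "No waiting events found dispatching network-vif-plugged-..." / "Received unexpected event ..." pair above comes from the compute manager trying to hand a Neutron external event to a waiter that has not registered yet: the instance is still in vm_state building / task_state spawning. Below is a minimal, hand-rolled sketch of that correlation pattern, assuming invented names (InstanceEventTracker, prepare_for, dispatch); the real mechanism is Nova's per-instance event bookkeeping behind pop_instance_event in the log.

    import threading
    from collections import defaultdict

    class InstanceEventTracker:
        """Toy illustration: waiters register an event name per instance;
        external events either wake a waiter or are reported as unexpected."""

        def __init__(self):
            self._lock = threading.Lock()
            self._waiters = defaultdict(dict)  # instance_uuid -> {event_name: Event}

        def prepare_for(self, instance_uuid, event_name):
            ev = threading.Event()
            with self._lock:
                self._waiters[instance_uuid][event_name] = ev
            return ev

        def dispatch(self, instance_uuid, event_name):
            with self._lock:
                ev = self._waiters.get(instance_uuid, {}).pop(event_name, None)
            if ev is None:
                # Mirrors "No waiting events found dispatching ..." followed by
                # the "Received unexpected event ..." warning in the log.
                print(f"unexpected event {event_name} for {instance_uuid}")
                return False
            ev.set()
            return True

Because nothing was waiting, the warning here is benign: the port update that follows refreshes the instance network info cache anyway.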
[ 1628.775529] env[62277]: DEBUG nova.network.neutron [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Successfully updated port: 7d0db8c9-4016-49d1-86a8-87405d0654e0 {{(pid=62277) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1628.787251] env[62277]: DEBUG oslo_concurrency.lockutils [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Acquiring lock "refresh_cache-a7cc7e45-8567-4699-af83-624b1c7c5c64" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1628.787689] env[62277]: DEBUG oslo_concurrency.lockutils [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Acquired lock "refresh_cache-a7cc7e45-8567-4699-af83-624b1c7c5c64" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1628.787905] env[62277]: DEBUG nova.network.neutron [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Building network info cache for instance {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1628.827499] env[62277]: DEBUG nova.network.neutron [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Instance cache missing network info. 
{{(pid=62277) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1628.992201] env[62277]: DEBUG nova.network.neutron [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Updating instance_info_cache with network_info: [{"id": "7d0db8c9-4016-49d1-86a8-87405d0654e0", "address": "fa:16:3e:3f:c1:7b", "network": {"id": "f897bb48-da0c-43a8-8ec5-20641aaff395", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-678825762-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "9ac6ecb637fb4c9d8d6542633709cd80", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "195e328b-e41a-49f5-9e51-546b8ea8ceba", "external-id": "nsx-vlan-transportzone-735", "segmentation_id": 735, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7d0db8c9-40", "ovs_interfaceid": "7d0db8c9-4016-49d1-86a8-87405d0654e0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1629.005306] env[62277]: DEBUG oslo_concurrency.lockutils [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Releasing lock "refresh_cache-a7cc7e45-8567-4699-af83-624b1c7c5c64" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1629.005564] env[62277]: DEBUG nova.compute.manager [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Instance network_info: |[{"id": "7d0db8c9-4016-49d1-86a8-87405d0654e0", "address": "fa:16:3e:3f:c1:7b", "network": {"id": "f897bb48-da0c-43a8-8ec5-20641aaff395", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-678825762-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "9ac6ecb637fb4c9d8d6542633709cd80", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "195e328b-e41a-49f5-9e51-546b8ea8ceba", "external-id": "nsx-vlan-transportzone-735", "segmentation_id": 735, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7d0db8c9-40", "ovs_interfaceid": "7d0db8c9-4016-49d1-86a8-87405d0654e0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62277) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 1629.005970] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:3f:c1:7b', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '195e328b-e41a-49f5-9e51-546b8ea8ceba', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '7d0db8c9-4016-49d1-86a8-87405d0654e0', 'vif_model': 'vmxnet3'}] {{(pid=62277) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1629.013535] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Creating folder: Project (9ac6ecb637fb4c9d8d6542633709cd80). Parent ref: group-v297781. {{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1629.014122] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-04b82007-d26a-4d64-9b51-c58c3ff9ec2d {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1629.024306] env[62277]: INFO nova.virt.vmwareapi.vm_util [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Created folder: Project (9ac6ecb637fb4c9d8d6542633709cd80) in parent group-v297781. [ 1629.024755] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Creating folder: Instances. Parent ref: group-v297860. {{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1629.024755] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e702ec99-de60-4024-b44b-cd3b1734fdad {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1629.033958] env[62277]: INFO nova.virt.vmwareapi.vm_util [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Created folder: Instances in parent group-v297860. [ 1629.034199] env[62277]: DEBUG oslo.service.loopingcall [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1629.034385] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Creating VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1629.034574] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-c6980084-7e85-49e6-b3c0-914ff6e3bc7c {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1629.053794] env[62277]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1629.053794] env[62277]: value = "task-1405433" [ 1629.053794] env[62277]: _type = "Task" [ 1629.053794] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1629.061305] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405433, 'name': CreateVM_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1629.565052] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405433, 'name': CreateVM_Task, 'duration_secs': 0.300926} completed successfully. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1629.565052] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Created VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1629.571325] env[62277]: DEBUG oslo_concurrency.lockutils [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1629.571493] env[62277]: DEBUG oslo_concurrency.lockutils [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1629.571851] env[62277]: DEBUG oslo_concurrency.lockutils [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1629.572105] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c9b28b4c-0fa9-4258-8781-0d2ce0f557ef {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1629.576450] env[62277]: DEBUG oslo_vmware.api [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Waiting for the task: (returnval){ [ 1629.576450] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52cf46b4-0f60-34bc-4142-852a5abcabbc" [ 1629.576450] env[62277]: _type = "Task" [ 1629.576450] 
env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1629.584046] env[62277]: DEBUG oslo_vmware.api [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52cf46b4-0f60-34bc-4142-852a5abcabbc, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1630.086231] env[62277]: DEBUG oslo_concurrency.lockutils [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1630.086542] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Processing image 6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1630.086705] env[62277]: DEBUG oslo_concurrency.lockutils [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1630.841065] env[62277]: DEBUG nova.compute.manager [req-dfd99fc3-b1d2-4b9c-95d0-350631c54fca req-2dac1ac8-fd12-499c-a815-097147673073 service nova] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Received event network-changed-7d0db8c9-4016-49d1-86a8-87405d0654e0 {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 1630.841303] env[62277]: DEBUG nova.compute.manager [req-dfd99fc3-b1d2-4b9c-95d0-350631c54fca req-2dac1ac8-fd12-499c-a815-097147673073 service nova] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Refreshing instance network info cache due to event network-changed-7d0db8c9-4016-49d1-86a8-87405d0654e0. 
{{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11104}} [ 1630.841492] env[62277]: DEBUG oslo_concurrency.lockutils [req-dfd99fc3-b1d2-4b9c-95d0-350631c54fca req-2dac1ac8-fd12-499c-a815-097147673073 service nova] Acquiring lock "refresh_cache-a7cc7e45-8567-4699-af83-624b1c7c5c64" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1630.841626] env[62277]: DEBUG oslo_concurrency.lockutils [req-dfd99fc3-b1d2-4b9c-95d0-350631c54fca req-2dac1ac8-fd12-499c-a815-097147673073 service nova] Acquired lock "refresh_cache-a7cc7e45-8567-4699-af83-624b1c7c5c64" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1630.841783] env[62277]: DEBUG nova.network.neutron [req-dfd99fc3-b1d2-4b9c-95d0-350631c54fca req-2dac1ac8-fd12-499c-a815-097147673073 service nova] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Refreshing network info cache for port 7d0db8c9-4016-49d1-86a8-87405d0654e0 {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1631.139851] env[62277]: DEBUG nova.network.neutron [req-dfd99fc3-b1d2-4b9c-95d0-350631c54fca req-2dac1ac8-fd12-499c-a815-097147673073 service nova] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Updated VIF entry in instance network info cache for port 7d0db8c9-4016-49d1-86a8-87405d0654e0. {{(pid=62277) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1631.140323] env[62277]: DEBUG nova.network.neutron [req-dfd99fc3-b1d2-4b9c-95d0-350631c54fca req-2dac1ac8-fd12-499c-a815-097147673073 service nova] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Updating instance_info_cache with network_info: [{"id": "7d0db8c9-4016-49d1-86a8-87405d0654e0", "address": "fa:16:3e:3f:c1:7b", "network": {"id": "f897bb48-da0c-43a8-8ec5-20641aaff395", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-678825762-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "9ac6ecb637fb4c9d8d6542633709cd80", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "195e328b-e41a-49f5-9e51-546b8ea8ceba", "external-id": "nsx-vlan-transportzone-735", "segmentation_id": 735, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7d0db8c9-40", "ovs_interfaceid": "7d0db8c9-4016-49d1-86a8-87405d0654e0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1631.149399] env[62277]: DEBUG oslo_concurrency.lockutils [req-dfd99fc3-b1d2-4b9c-95d0-350631c54fca req-2dac1ac8-fd12-499c-a815-097147673073 service nova] Releasing lock "refresh_cache-a7cc7e45-8567-4699-af83-624b1c7c5c64" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1638.744349] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e37a08e3-976c-45f1-b47f-15fef7ecad2e tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Acquiring lock 
"a7cc7e45-8567-4699-af83-624b1c7c5c64" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1642.169589] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1642.169828] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Starting heal instance info cache {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9909}} [ 1642.169912] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Rebuilding the list of instances to heal {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9913}} [ 1642.191123] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1642.191301] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1642.191404] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1642.191527] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1642.191649] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1642.191767] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 8d00162c-7379-48b6-841b-f802db2582db] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1642.191883] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1642.191999] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Skipping network cache update for instance because it is Building. 
{{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1642.192133] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 13959890-87a1-45ba-98de-621373e265e7] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1642.192314] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1642.192387] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Didn't find any instances for network info cache update. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9995}} [ 1642.192855] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1644.187291] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1645.168666] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1647.168502] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1647.168797] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1647.168902] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1647.169086] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager.update_available_resource {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1647.180704] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1647.180926] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1647.181111] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1647.181270] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62277) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1647.182537] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e285439-509a-4c44-923c-7fc82a917c09 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1647.191348] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b5059010-c645-4611-a0d6-084309fe22ca {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1647.205075] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ff7aba6e-a15c-4372-a938-deafc729d205 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1647.211070] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c4b7662-7f2a-4aa6-8bf9-43a44a5fe2c5 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1647.238925] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181413MB free_disk=184GB free_vcpus=48 pci_devices=None {{(pid=62277) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1647.239041] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1647.239238] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1647.311448] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1647.311607] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 1e8429b2-7149-4832-8590-e0ebd8501176 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1647.311735] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 21bd4623-2b46-43c4-859f-c4d3bf261e1f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1647.311860] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 6d759045-e1fc-43ea-a882-1ead769b6d29 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1647.311985] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 32bed248-06d5-47a1-b281-47921d99dbf6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1647.312122] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 8d00162c-7379-48b6-841b-f802db2582db actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1647.312244] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 900160c8-a715-45a4-8709-b314fc3216d5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1647.312359] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 63267d5c-d004-41c1-866a-75b9e37521b7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1647.312475] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 13959890-87a1-45ba-98de-621373e265e7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1647.312587] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance a7cc7e45-8567-4699-af83-624b1c7c5c64 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1647.323017] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 24ee6c71-7267-4fe2-8ac4-84bf1d00c024 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1647.332680] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance f468d0c6-35ed-4f8d-a3dc-aea9462aa7bb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1647.342951] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 8167915d-ed3a-44b7-8eff-d585e7f6ffbf has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1647.351822] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 9fedbb74-ae57-4cb8-8496-2ff9c703b46e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1647.360567] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 42005809-1926-44b2-8ef6-3b6cb28a4020 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1647.369590] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 1742e8d0-3cf2-4a78-99e4-652f9664df96 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1647.378037] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1647.386422] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1647.395246] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 163eb4e7-33f8-4674-8a3f-5094356e250d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1647.403604] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 2fdebb33-a32b-4753-aa2e-adfc4b252fac has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1647.412221] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 5ed6905b-ddb5-4517-a8dc-ee8e00b53db0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1647.420199] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 9f8669ed-f65f-4472-9ef6-01953c48466b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1647.428791] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 3b8f2c10-5dce-44d0-bb6b-939afc01e44b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1647.429011] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1647.429171] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1647.671391] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-445bc0dd-04f9-4d77-8528-49b5826fae65 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1647.679920] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f5cfeab9-2db9-4891-bdfd-689578424794 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1647.708746] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-49561cd8-ebd1-4cc1-9d83-12135beba7e9 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1647.715824] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3ba97495-eaf0-4f59-a193-c99d0d5e42be {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1647.729685] env[62277]: DEBUG nova.compute.provider_tree [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1647.737966] env[62277]: DEBUG nova.scheduler.client.report [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1647.751749] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62277) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1647.751749] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.512s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1649.752279] env[62277]: DEBUG oslo_service.periodic_task [None 
req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1649.752671] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=62277) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10528}} [ 1651.515354] env[62277]: DEBUG oslo_concurrency.lockutils [None req-1f6a5bbb-98c9-46d5-8d0c-79b5fb6e8c4a tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Acquiring lock "5a5ff5bf-d965-42e2-aa8b-67be4b5f7362" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1651.516704] env[62277]: DEBUG oslo_concurrency.lockutils [None req-1f6a5bbb-98c9-46d5-8d0c-79b5fb6e8c4a tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Lock "5a5ff5bf-d965-42e2-aa8b-67be4b5f7362" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1652.476805] env[62277]: DEBUG oslo_concurrency.lockutils [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Acquiring lock "400beb27-a709-4ef4-851e-5caaab9ca60b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1652.477418] env[62277]: DEBUG oslo_concurrency.lockutils [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Lock "400beb27-a709-4ef4-851e-5caaab9ca60b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1657.593248] env[62277]: DEBUG oslo_concurrency.lockutils [None req-ef4504dd-12e5-4062-9061-5368df2cee5e tempest-AttachVolumeNegativeTest-184472694 tempest-AttachVolumeNegativeTest-184472694-project-member] Acquiring lock "272391f1-a349-4525-91ec-75b3ba7aeb1c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1657.593584] env[62277]: DEBUG oslo_concurrency.lockutils [None req-ef4504dd-12e5-4062-9061-5368df2cee5e tempest-AttachVolumeNegativeTest-184472694 tempest-AttachVolumeNegativeTest-184472694-project-member] Lock "272391f1-a349-4525-91ec-75b3ba7aeb1c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1659.672678] env[62277]: DEBUG oslo_concurrency.lockutils [None req-76e59b66-33b0-4df5-9a06-03c5bacc073b tempest-ServerPasswordTestJSON-23033146 tempest-ServerPasswordTestJSON-23033146-project-member] Acquiring lock 
"227394fe-d0c6-48c8-aed2-433ce34e34f8" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1659.672951] env[62277]: DEBUG oslo_concurrency.lockutils [None req-76e59b66-33b0-4df5-9a06-03c5bacc073b tempest-ServerPasswordTestJSON-23033146 tempest-ServerPasswordTestJSON-23033146-project-member] Lock "227394fe-d0c6-48c8-aed2-433ce34e34f8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1674.829200] env[62277]: WARNING oslo_vmware.rw_handles [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1674.829200] env[62277]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1674.829200] env[62277]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1674.829200] env[62277]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1674.829200] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1674.829200] env[62277]: ERROR oslo_vmware.rw_handles response.begin() [ 1674.829200] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1674.829200] env[62277]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1674.829200] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1674.829200] env[62277]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1674.829200] env[62277]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1674.829200] env[62277]: ERROR oslo_vmware.rw_handles [ 1674.829659] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Downloaded image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to vmware_temp/50ce76bd-a65b-4013-9b82-43de76187d52/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1674.831362] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Caching image {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1674.831600] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Copying Virtual Disk [datastore2] vmware_temp/50ce76bd-a65b-4013-9b82-43de76187d52/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk to [datastore2] 
vmware_temp/50ce76bd-a65b-4013-9b82-43de76187d52/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk {{(pid=62277) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1674.831884] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-0b30c415-3a2d-4f07-829d-7bb84e3068fa {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1674.839505] env[62277]: DEBUG oslo_vmware.api [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Waiting for the task: (returnval){ [ 1674.839505] env[62277]: value = "task-1405434" [ 1674.839505] env[62277]: _type = "Task" [ 1674.839505] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1674.847466] env[62277]: DEBUG oslo_vmware.api [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Task: {'id': task-1405434, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1675.350466] env[62277]: DEBUG oslo_vmware.exceptions [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Fault InvalidArgument not matched. {{(pid=62277) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1675.350742] env[62277]: DEBUG oslo_concurrency.lockutils [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1675.351381] env[62277]: ERROR nova.compute.manager [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1675.351381] env[62277]: Faults: ['InvalidArgument'] [ 1675.351381] env[62277]: ERROR nova.compute.manager [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Traceback (most recent call last): [ 1675.351381] env[62277]: ERROR nova.compute.manager [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1675.351381] env[62277]: ERROR nova.compute.manager [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] yield resources [ 1675.351381] env[62277]: ERROR nova.compute.manager [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1675.351381] env[62277]: ERROR nova.compute.manager [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] self.driver.spawn(context, instance, image_meta, [ 1675.351381] env[62277]: ERROR nova.compute.manager [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1675.351381] env[62277]: ERROR nova.compute.manager [instance: 
3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1675.351381] env[62277]: ERROR nova.compute.manager [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1675.351381] env[62277]: ERROR nova.compute.manager [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] self._fetch_image_if_missing(context, vi) [ 1675.351381] env[62277]: ERROR nova.compute.manager [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1675.351381] env[62277]: ERROR nova.compute.manager [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] image_cache(vi, tmp_image_ds_loc) [ 1675.351381] env[62277]: ERROR nova.compute.manager [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1675.351381] env[62277]: ERROR nova.compute.manager [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] vm_util.copy_virtual_disk( [ 1675.351381] env[62277]: ERROR nova.compute.manager [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1675.351381] env[62277]: ERROR nova.compute.manager [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] session._wait_for_task(vmdk_copy_task) [ 1675.351381] env[62277]: ERROR nova.compute.manager [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1675.351381] env[62277]: ERROR nova.compute.manager [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] return self.wait_for_task(task_ref) [ 1675.351381] env[62277]: ERROR nova.compute.manager [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1675.351381] env[62277]: ERROR nova.compute.manager [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] return evt.wait() [ 1675.351381] env[62277]: ERROR nova.compute.manager [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1675.351381] env[62277]: ERROR nova.compute.manager [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] result = hub.switch() [ 1675.351381] env[62277]: ERROR nova.compute.manager [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1675.351381] env[62277]: ERROR nova.compute.manager [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] return self.greenlet.switch() [ 1675.351381] env[62277]: ERROR nova.compute.manager [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1675.351381] env[62277]: ERROR nova.compute.manager [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] self.f(*self.args, **self.kw) [ 1675.351381] env[62277]: ERROR nova.compute.manager [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1675.351381] env[62277]: ERROR nova.compute.manager [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] raise exceptions.translate_fault(task_info.error) [ 1675.351381] env[62277]: ERROR nova.compute.manager [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] 
oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1675.351381] env[62277]: ERROR nova.compute.manager [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Faults: ['InvalidArgument'] [ 1675.351381] env[62277]: ERROR nova.compute.manager [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] [ 1675.352383] env[62277]: INFO nova.compute.manager [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Terminating instance [ 1675.353207] env[62277]: DEBUG oslo_concurrency.lockutils [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1675.353412] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1675.353650] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-73f6dc94-821d-4a3a-bf96-d45cf47ca0b2 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1675.355899] env[62277]: DEBUG nova.compute.manager [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Start destroying the instance on the hypervisor. 
{{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1675.356099] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1675.356840] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3790b521-ff88-4fb5-af51-28ee00b263df {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1675.363926] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Unregistering the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1675.364163] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-c428cc9a-f5e0-4869-bc8c-c9eb06ba3ddf {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1675.366349] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1675.366515] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62277) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1675.367605] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9177e2a6-f688-4555-baa3-a47255c02aae {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1675.372043] env[62277]: DEBUG oslo_vmware.api [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Waiting for the task: (returnval){ [ 1675.372043] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]5299c548-038a-e17d-7481-6287d5fb498b" [ 1675.372043] env[62277]: _type = "Task" [ 1675.372043] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1675.380401] env[62277]: DEBUG oslo_vmware.api [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]5299c548-038a-e17d-7481-6287d5fb498b, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1675.498564] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Unregistered the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1675.498848] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Deleting contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1675.499093] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Deleting the datastore file [datastore2] 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0 {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1675.499359] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-8b3ba3b7-cf2e-4df7-abf9-9a1e703fa362 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1675.505677] env[62277]: DEBUG oslo_vmware.api [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Waiting for the task: (returnval){ [ 1675.505677] env[62277]: value = "task-1405436" [ 1675.505677] env[62277]: _type = "Task" [ 1675.505677] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1675.513292] env[62277]: DEBUG oslo_vmware.api [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Task: {'id': task-1405436, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1675.882410] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Preparing fetch location {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1675.882644] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Creating directory with path [datastore2] vmware_temp/6f86faaf-2bd1-42cb-aca7-f1c437f565c3/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1675.882840] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-77f84b49-1033-4b74-aa12-70f55ab565e2 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1675.894269] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Created directory with path [datastore2] vmware_temp/6f86faaf-2bd1-42cb-aca7-f1c437f565c3/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1675.894429] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Fetch image to [datastore2] vmware_temp/6f86faaf-2bd1-42cb-aca7-f1c437f565c3/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1675.894592] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to [datastore2] vmware_temp/6f86faaf-2bd1-42cb-aca7-f1c437f565c3/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1675.895443] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-87a7e79f-dc36-4373-9afc-6a5aaf421845 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1675.901983] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db8307fc-dec1-4c12-b226-298d168d7c93 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1675.911165] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62e99c1c-c081-4328-96a1-2e73d51b0327 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1675.941217] env[62277]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b1897acf-8f44-4d44-97f3-7364879d5c83 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1675.947725] env[62277]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-8280ce92-1005-4c0d-9d14-5912a083ee49 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1675.967643] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1676.016489] env[62277]: DEBUG oslo_vmware.api [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Task: {'id': task-1405436, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.083685} completed successfully. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1676.016924] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Deleted the datastore file {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1676.017166] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Deleted contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1676.017435] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1676.017627] env[62277]: INFO nova.compute.manager [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Took 0.66 seconds to destroy the instance on the hypervisor. 
[ 1676.019840] env[62277]: DEBUG nova.compute.claims [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Aborting claim: {{(pid=62277) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1676.020013] env[62277]: DEBUG oslo_concurrency.lockutils [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1676.020231] env[62277]: DEBUG oslo_concurrency.lockutils [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1676.123140] env[62277]: DEBUG oslo_vmware.rw_handles [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/6f86faaf-2bd1-42cb-aca7-f1c437f565c3/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62277) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1676.183249] env[62277]: DEBUG oslo_vmware.rw_handles [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Completed reading data from the image iterator. {{(pid=62277) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1676.183442] env[62277]: DEBUG oslo_vmware.rw_handles [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/6f86faaf-2bd1-42cb-aca7-f1c437f565c3/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=62277) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1676.374075] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-daf053ce-bf64-4231-9821-7d5f70e7fa23 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1676.381028] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01245e4f-750a-46b5-b0c1-28fb4e27fb71 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1676.411449] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f5902327-a565-41c7-962a-7a74f9ab391d {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1676.418858] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b7af5c2e-791a-4b72-ace8-ee75684636aa {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1676.431822] env[62277]: DEBUG nova.compute.provider_tree [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1676.442084] env[62277]: DEBUG nova.scheduler.client.report [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1676.456277] env[62277]: DEBUG oslo_concurrency.lockutils [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.436s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1676.456811] env[62277]: ERROR nova.compute.manager [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1676.456811] env[62277]: Faults: ['InvalidArgument'] [ 1676.456811] env[62277]: ERROR nova.compute.manager [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Traceback (most recent call last): [ 1676.456811] env[62277]: ERROR nova.compute.manager [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1676.456811] env[62277]: ERROR nova.compute.manager 
[instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] self.driver.spawn(context, instance, image_meta, [ 1676.456811] env[62277]: ERROR nova.compute.manager [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1676.456811] env[62277]: ERROR nova.compute.manager [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1676.456811] env[62277]: ERROR nova.compute.manager [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1676.456811] env[62277]: ERROR nova.compute.manager [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] self._fetch_image_if_missing(context, vi) [ 1676.456811] env[62277]: ERROR nova.compute.manager [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1676.456811] env[62277]: ERROR nova.compute.manager [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] image_cache(vi, tmp_image_ds_loc) [ 1676.456811] env[62277]: ERROR nova.compute.manager [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1676.456811] env[62277]: ERROR nova.compute.manager [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] vm_util.copy_virtual_disk( [ 1676.456811] env[62277]: ERROR nova.compute.manager [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1676.456811] env[62277]: ERROR nova.compute.manager [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] session._wait_for_task(vmdk_copy_task) [ 1676.456811] env[62277]: ERROR nova.compute.manager [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1676.456811] env[62277]: ERROR nova.compute.manager [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] return self.wait_for_task(task_ref) [ 1676.456811] env[62277]: ERROR nova.compute.manager [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1676.456811] env[62277]: ERROR nova.compute.manager [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] return evt.wait() [ 1676.456811] env[62277]: ERROR nova.compute.manager [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1676.456811] env[62277]: ERROR nova.compute.manager [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] result = hub.switch() [ 1676.456811] env[62277]: ERROR nova.compute.manager [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1676.456811] env[62277]: ERROR nova.compute.manager [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] return self.greenlet.switch() [ 1676.456811] env[62277]: ERROR nova.compute.manager [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1676.456811] env[62277]: ERROR nova.compute.manager [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] self.f(*self.args, **self.kw) [ 1676.456811] env[62277]: ERROR nova.compute.manager [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1676.456811] env[62277]: ERROR nova.compute.manager [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] raise exceptions.translate_fault(task_info.error) [ 1676.456811] env[62277]: ERROR nova.compute.manager [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1676.456811] env[62277]: ERROR nova.compute.manager [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Faults: ['InvalidArgument'] [ 1676.456811] env[62277]: ERROR nova.compute.manager [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] [ 1676.461442] env[62277]: DEBUG nova.compute.utils [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] VimFaultException {{(pid=62277) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1676.461442] env[62277]: DEBUG nova.compute.manager [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Build of instance 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0 was re-scheduled: A specified parameter was not correct: fileType [ 1676.461442] env[62277]: Faults: ['InvalidArgument'] {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1676.461577] env[62277]: DEBUG nova.compute.manager [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Unplugging VIFs for instance {{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1676.461979] env[62277]: DEBUG nova.compute.manager [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1676.461979] env[62277]: DEBUG nova.compute.manager [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1676.462117] env[62277]: DEBUG nova.network.neutron [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1677.364030] env[62277]: DEBUG nova.network.neutron [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1677.375610] env[62277]: INFO nova.compute.manager [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Took 0.91 seconds to deallocate network for instance. [ 1677.471122] env[62277]: INFO nova.scheduler.client.report [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Deleted allocations for instance 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0 [ 1677.494384] env[62277]: DEBUG oslo_concurrency.lockutils [None req-a7e0e457-6419-4155-89db-a115bb39e293 tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Lock "3b9b51fe-d558-40ef-b22a-2b8958b3e1a0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 653.871s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1677.495475] env[62277]: DEBUG oslo_concurrency.lockutils [None req-f8f6abaf-c69a-41fb-8aab-0ac498ab193e tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Lock "3b9b51fe-d558-40ef-b22a-2b8958b3e1a0" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 454.817s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1677.495698] env[62277]: DEBUG oslo_concurrency.lockutils [None req-f8f6abaf-c69a-41fb-8aab-0ac498ab193e tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Acquiring lock "3b9b51fe-d558-40ef-b22a-2b8958b3e1a0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1677.495904] env[62277]: DEBUG oslo_concurrency.lockutils [None req-f8f6abaf-c69a-41fb-8aab-0ac498ab193e tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Lock "3b9b51fe-d558-40ef-b22a-2b8958b3e1a0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=62277) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1677.496085] env[62277]: DEBUG oslo_concurrency.lockutils [None req-f8f6abaf-c69a-41fb-8aab-0ac498ab193e tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Lock "3b9b51fe-d558-40ef-b22a-2b8958b3e1a0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1677.498570] env[62277]: INFO nova.compute.manager [None req-f8f6abaf-c69a-41fb-8aab-0ac498ab193e tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Terminating instance [ 1677.500349] env[62277]: DEBUG nova.compute.manager [None req-f8f6abaf-c69a-41fb-8aab-0ac498ab193e tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Start destroying the instance on the hypervisor. {{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1677.500540] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-f8f6abaf-c69a-41fb-8aab-0ac498ab193e tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1677.501207] env[62277]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-afca20c3-93e2-484b-8706-f64a26047726 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1677.507052] env[62277]: DEBUG nova.compute.manager [None req-4fa6a66c-d9d1-4bdc-bcb0-922a4670790e tempest-ServerActionsV293TestJSON-1090793783 tempest-ServerActionsV293TestJSON-1090793783-project-member] [instance: 24ee6c71-7267-4fe2-8ac4-84bf1d00c024] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1677.514674] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5165bb1d-0ba5-449d-a50d-4ab44ce02ac7 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1677.534017] env[62277]: DEBUG nova.compute.manager [None req-4fa6a66c-d9d1-4bdc-bcb0-922a4670790e tempest-ServerActionsV293TestJSON-1090793783 tempest-ServerActionsV293TestJSON-1090793783-project-member] [instance: 24ee6c71-7267-4fe2-8ac4-84bf1d00c024] Instance disappeared before build. {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1677.543337] env[62277]: WARNING nova.virt.vmwareapi.vmops [None req-f8f6abaf-c69a-41fb-8aab-0ac498ab193e tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0 could not be found. 
[ 1677.543515] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-f8f6abaf-c69a-41fb-8aab-0ac498ab193e tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1677.543682] env[62277]: INFO nova.compute.manager [None req-f8f6abaf-c69a-41fb-8aab-0ac498ab193e tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1677.543915] env[62277]: DEBUG oslo.service.loopingcall [None req-f8f6abaf-c69a-41fb-8aab-0ac498ab193e tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1677.544319] env[62277]: DEBUG nova.compute.manager [-] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1677.544464] env[62277]: DEBUG nova.network.neutron [-] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1677.560277] env[62277]: DEBUG oslo_concurrency.lockutils [None req-4fa6a66c-d9d1-4bdc-bcb0-922a4670790e tempest-ServerActionsV293TestJSON-1090793783 tempest-ServerActionsV293TestJSON-1090793783-project-member] Lock "24ee6c71-7267-4fe2-8ac4-84bf1d00c024" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 225.630s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1677.570063] env[62277]: DEBUG nova.compute.manager [None req-e9443422-ad1c-4641-a2d4-d11014892d36 tempest-ServerMetadataNegativeTestJSON-1764342254 tempest-ServerMetadataNegativeTestJSON-1764342254-project-member] [instance: f468d0c6-35ed-4f8d-a3dc-aea9462aa7bb] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1677.581043] env[62277]: DEBUG nova.network.neutron [-] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1677.588441] env[62277]: INFO nova.compute.manager [-] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] Took 0.04 seconds to deallocate network for instance. [ 1677.602938] env[62277]: DEBUG nova.compute.manager [None req-e9443422-ad1c-4641-a2d4-d11014892d36 tempest-ServerMetadataNegativeTestJSON-1764342254 tempest-ServerMetadataNegativeTestJSON-1764342254-project-member] [instance: f468d0c6-35ed-4f8d-a3dc-aea9462aa7bb] Instance disappeared before build. 
{{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1677.625532] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e9443422-ad1c-4641-a2d4-d11014892d36 tempest-ServerMetadataNegativeTestJSON-1764342254 tempest-ServerMetadataNegativeTestJSON-1764342254-project-member] Lock "f468d0c6-35ed-4f8d-a3dc-aea9462aa7bb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 224.706s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1677.635073] env[62277]: DEBUG nova.compute.manager [None req-0fbd05d1-7e13-4efb-9287-1c5673b04c42 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 8167915d-ed3a-44b7-8eff-d585e7f6ffbf] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1677.660496] env[62277]: DEBUG nova.compute.manager [None req-0fbd05d1-7e13-4efb-9287-1c5673b04c42 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 8167915d-ed3a-44b7-8eff-d585e7f6ffbf] Instance disappeared before build. {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1677.679619] env[62277]: DEBUG oslo_concurrency.lockutils [None req-f8f6abaf-c69a-41fb-8aab-0ac498ab193e tempest-ServersTestMultiNic-820412428 tempest-ServersTestMultiNic-820412428-project-member] Lock "3b9b51fe-d558-40ef-b22a-2b8958b3e1a0" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.184s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1677.681020] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "3b9b51fe-d558-40ef-b22a-2b8958b3e1a0" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 429.542s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1677.681611] env[62277]: INFO nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 3b9b51fe-d558-40ef-b22a-2b8958b3e1a0] During sync_power_state the instance has a pending task (deleting). Skip. [ 1677.681611] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "3b9b51fe-d558-40ef-b22a-2b8958b3e1a0" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1677.683207] env[62277]: DEBUG oslo_concurrency.lockutils [None req-0fbd05d1-7e13-4efb-9287-1c5673b04c42 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Lock "8167915d-ed3a-44b7-8eff-d585e7f6ffbf" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 223.446s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1677.691191] env[62277]: DEBUG nova.compute.manager [None req-678307d2-4680-407a-b1a4-a1159637463f tempest-ServerGroupTestJSON-441783187 tempest-ServerGroupTestJSON-441783187-project-member] [instance: 9fedbb74-ae57-4cb8-8496-2ff9c703b46e] Starting instance... 
{{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1677.714582] env[62277]: DEBUG nova.compute.manager [None req-678307d2-4680-407a-b1a4-a1159637463f tempest-ServerGroupTestJSON-441783187 tempest-ServerGroupTestJSON-441783187-project-member] [instance: 9fedbb74-ae57-4cb8-8496-2ff9c703b46e] Instance disappeared before build. {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1677.736428] env[62277]: DEBUG oslo_concurrency.lockutils [None req-678307d2-4680-407a-b1a4-a1159637463f tempest-ServerGroupTestJSON-441783187 tempest-ServerGroupTestJSON-441783187-project-member] Lock "9fedbb74-ae57-4cb8-8496-2ff9c703b46e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 220.957s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1677.748047] env[62277]: DEBUG nova.compute.manager [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1677.794713] env[62277]: DEBUG oslo_concurrency.lockutils [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1677.794959] env[62277]: DEBUG oslo_concurrency.lockutils [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1677.796502] env[62277]: INFO nova.compute.claims [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1678.109267] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6eab36e7-7c39-40f0-9a48-8e7288c1a2f4 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1678.117385] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-57134d66-dc6b-4062-ba2e-06f463877151 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1678.148933] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-719a6965-5c95-402b-8be3-47be4e2770eb {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1678.156598] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0309d8c5-d400-473d-9284-4f25c06dd0f2 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 
1678.170080] env[62277]: DEBUG nova.compute.provider_tree [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1678.179378] env[62277]: DEBUG nova.scheduler.client.report [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1678.194241] env[62277]: DEBUG oslo_concurrency.lockutils [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.399s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1678.194620] env[62277]: DEBUG nova.compute.manager [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Start building networks asynchronously for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1678.228991] env[62277]: DEBUG nova.compute.utils [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Using /dev/sd instead of None {{(pid=62277) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1678.230832] env[62277]: DEBUG nova.compute.manager [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Allocating IP information in the background. {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1678.231486] env[62277]: DEBUG nova.network.neutron [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] allocate_for_instance() {{(pid=62277) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1678.242117] env[62277]: DEBUG nova.compute.manager [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Start building block device mappings for instance. 
{{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1678.303935] env[62277]: DEBUG nova.policy [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '013359a6ab0644799bb338125a970c37', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '47f21dc2b2ad4fe692324779a4a84760', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62277) authorize /opt/stack/nova/nova/policy.py:203}} [ 1678.316140] env[62277]: DEBUG nova.compute.manager [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Start spawning the instance on the hypervisor. {{(pid=62277) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1678.359970] env[62277]: DEBUG nova.virt.hardware [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-11-06T22:11:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-11-06T22:11:15Z,direct_url=,disk_format='vmdk',id=6f125163-af69-40e9-92ae-3b8a01d74b60,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='63ed6630e9c140baa826f53d7a0564d1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-11-06T22:11:16Z,virtual_size=,visibility=), allow threads: False {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1678.360249] env[62277]: DEBUG nova.virt.hardware [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Flavor limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1678.360407] env[62277]: DEBUG nova.virt.hardware [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Image limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1678.360588] env[62277]: DEBUG nova.virt.hardware [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Flavor pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1678.360735] env[62277]: DEBUG nova.virt.hardware [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Image pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1678.360878] env[62277]: DEBUG 
nova.virt.hardware [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1678.361221] env[62277]: DEBUG nova.virt.hardware [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1678.361836] env[62277]: DEBUG nova.virt.hardware [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1678.361836] env[62277]: DEBUG nova.virt.hardware [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Got 1 possible topologies {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1678.361836] env[62277]: DEBUG nova.virt.hardware [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1678.361968] env[62277]: DEBUG nova.virt.hardware [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1678.362966] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a90203f-406d-41ea-abaf-c56fd6c5e188 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1678.372115] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4a50d63-6379-444a-a345-8932525f8061 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1678.732577] env[62277]: DEBUG nova.network.neutron [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Successfully created port: 803188c7-5708-409a-9465-308c5b321922 {{(pid=62277) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1679.724532] env[62277]: DEBUG nova.network.neutron [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Successfully updated port: 803188c7-5708-409a-9465-308c5b321922 {{(pid=62277) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1679.735063] env[62277]: DEBUG oslo_concurrency.lockutils [None 
req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Acquiring lock "refresh_cache-42005809-1926-44b2-8ef6-3b6cb28a4020" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1679.735222] env[62277]: DEBUG oslo_concurrency.lockutils [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Acquired lock "refresh_cache-42005809-1926-44b2-8ef6-3b6cb28a4020" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1679.735401] env[62277]: DEBUG nova.network.neutron [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Building network info cache for instance {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1679.776527] env[62277]: DEBUG nova.network.neutron [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Instance cache missing network info. {{(pid=62277) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1679.936092] env[62277]: DEBUG nova.network.neutron [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Updating instance_info_cache with network_info: [{"id": "803188c7-5708-409a-9465-308c5b321922", "address": "fa:16:3e:09:a2:29", "network": {"id": "0fdd7696-a8f7-4fbe-88a6-4c5254049911", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-706519722-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "47f21dc2b2ad4fe692324779a4a84760", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7150f662-0cf1-44f9-ae14-d70f479649b6", "external-id": "nsx-vlan-transportzone-712", "segmentation_id": 712, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap803188c7-57", "ovs_interfaceid": "803188c7-5708-409a-9465-308c5b321922", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1679.950813] env[62277]: DEBUG oslo_concurrency.lockutils [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Releasing lock "refresh_cache-42005809-1926-44b2-8ef6-3b6cb28a4020" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1679.951178] env[62277]: DEBUG nova.compute.manager [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 
tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Instance network_info: |[{"id": "803188c7-5708-409a-9465-308c5b321922", "address": "fa:16:3e:09:a2:29", "network": {"id": "0fdd7696-a8f7-4fbe-88a6-4c5254049911", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-706519722-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "47f21dc2b2ad4fe692324779a4a84760", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7150f662-0cf1-44f9-ae14-d70f479649b6", "external-id": "nsx-vlan-transportzone-712", "segmentation_id": 712, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap803188c7-57", "ovs_interfaceid": "803188c7-5708-409a-9465-308c5b321922", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1679.951611] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:09:a2:29', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '7150f662-0cf1-44f9-ae14-d70f479649b6', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '803188c7-5708-409a-9465-308c5b321922', 'vif_model': 'vmxnet3'}] {{(pid=62277) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1679.959381] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Creating folder: Project (47f21dc2b2ad4fe692324779a4a84760). Parent ref: group-v297781. {{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1679.959928] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7e3d4dd1-581d-40c7-9dcf-d1b417f28215 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1679.970279] env[62277]: INFO nova.virt.vmwareapi.vm_util [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Created folder: Project (47f21dc2b2ad4fe692324779a4a84760) in parent group-v297781. [ 1679.970455] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Creating folder: Instances. Parent ref: group-v297863. 
{{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1679.970686] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-db2d09eb-aa40-4ff5-8b7d-959002077487 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1679.979212] env[62277]: INFO nova.virt.vmwareapi.vm_util [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Created folder: Instances in parent group-v297863. [ 1679.979434] env[62277]: DEBUG oslo.service.loopingcall [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1679.979610] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Creating VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1679.979834] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-a2b08834-d2f9-4aff-94dd-3cb04ab63492 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1679.998809] env[62277]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1679.998809] env[62277]: value = "task-1405439" [ 1679.998809] env[62277]: _type = "Task" [ 1679.998809] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1680.006172] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405439, 'name': CreateVM_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1680.509541] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405439, 'name': CreateVM_Task, 'duration_secs': 0.298139} completed successfully. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1680.509733] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Created VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1680.510400] env[62277]: DEBUG oslo_concurrency.lockutils [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1680.510566] env[62277]: DEBUG oslo_concurrency.lockutils [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1680.510902] env[62277]: DEBUG oslo_concurrency.lockutils [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1680.511181] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0fcdee00-9b5c-4bee-8322-779f594308bd {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1680.515839] env[62277]: DEBUG oslo_vmware.api [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Waiting for the task: (returnval){ [ 1680.515839] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]5219477c-2509-40c4-3439-55b0382587c2" [ 1680.515839] env[62277]: _type = "Task" [ 1680.515839] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1680.524065] env[62277]: DEBUG oslo_vmware.api [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]5219477c-2509-40c4-3439-55b0382587c2, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1681.026039] env[62277]: DEBUG oslo_concurrency.lockutils [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1681.026349] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Processing image 6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1681.026383] env[62277]: DEBUG oslo_concurrency.lockutils [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1681.272893] env[62277]: DEBUG nova.compute.manager [req-9f41339e-78b8-4067-8ff1-9043aa75e3c6 req-0fe252f2-3da5-4357-8281-4b163361515c service nova] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Received event network-vif-plugged-803188c7-5708-409a-9465-308c5b321922 {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 1681.273135] env[62277]: DEBUG oslo_concurrency.lockutils [req-9f41339e-78b8-4067-8ff1-9043aa75e3c6 req-0fe252f2-3da5-4357-8281-4b163361515c service nova] Acquiring lock "42005809-1926-44b2-8ef6-3b6cb28a4020-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1681.273346] env[62277]: DEBUG oslo_concurrency.lockutils [req-9f41339e-78b8-4067-8ff1-9043aa75e3c6 req-0fe252f2-3da5-4357-8281-4b163361515c service nova] Lock "42005809-1926-44b2-8ef6-3b6cb28a4020-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1681.273512] env[62277]: DEBUG oslo_concurrency.lockutils [req-9f41339e-78b8-4067-8ff1-9043aa75e3c6 req-0fe252f2-3da5-4357-8281-4b163361515c service nova] Lock "42005809-1926-44b2-8ef6-3b6cb28a4020-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1681.273685] env[62277]: DEBUG nova.compute.manager [req-9f41339e-78b8-4067-8ff1-9043aa75e3c6 req-0fe252f2-3da5-4357-8281-4b163361515c service nova] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] No waiting events found dispatching network-vif-plugged-803188c7-5708-409a-9465-308c5b321922 {{(pid=62277) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1681.273833] env[62277]: WARNING nova.compute.manager [req-9f41339e-78b8-4067-8ff1-9043aa75e3c6 req-0fe252f2-3da5-4357-8281-4b163361515c service nova] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Received unexpected event 
network-vif-plugged-803188c7-5708-409a-9465-308c5b321922 for instance with vm_state building and task_state spawning. [ 1681.273985] env[62277]: DEBUG nova.compute.manager [req-9f41339e-78b8-4067-8ff1-9043aa75e3c6 req-0fe252f2-3da5-4357-8281-4b163361515c service nova] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Received event network-changed-803188c7-5708-409a-9465-308c5b321922 {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 1681.274153] env[62277]: DEBUG nova.compute.manager [req-9f41339e-78b8-4067-8ff1-9043aa75e3c6 req-0fe252f2-3da5-4357-8281-4b163361515c service nova] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Refreshing instance network info cache due to event network-changed-803188c7-5708-409a-9465-308c5b321922. {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11104}} [ 1681.274335] env[62277]: DEBUG oslo_concurrency.lockutils [req-9f41339e-78b8-4067-8ff1-9043aa75e3c6 req-0fe252f2-3da5-4357-8281-4b163361515c service nova] Acquiring lock "refresh_cache-42005809-1926-44b2-8ef6-3b6cb28a4020" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1681.274467] env[62277]: DEBUG oslo_concurrency.lockutils [req-9f41339e-78b8-4067-8ff1-9043aa75e3c6 req-0fe252f2-3da5-4357-8281-4b163361515c service nova] Acquired lock "refresh_cache-42005809-1926-44b2-8ef6-3b6cb28a4020" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1681.274618] env[62277]: DEBUG nova.network.neutron [req-9f41339e-78b8-4067-8ff1-9043aa75e3c6 req-0fe252f2-3da5-4357-8281-4b163361515c service nova] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Refreshing network info cache for port 803188c7-5708-409a-9465-308c5b321922 {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1681.719993] env[62277]: DEBUG nova.network.neutron [req-9f41339e-78b8-4067-8ff1-9043aa75e3c6 req-0fe252f2-3da5-4357-8281-4b163361515c service nova] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Updated VIF entry in instance network info cache for port 803188c7-5708-409a-9465-308c5b321922. 
{{(pid=62277) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1681.720194] env[62277]: DEBUG nova.network.neutron [req-9f41339e-78b8-4067-8ff1-9043aa75e3c6 req-0fe252f2-3da5-4357-8281-4b163361515c service nova] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Updating instance_info_cache with network_info: [{"id": "803188c7-5708-409a-9465-308c5b321922", "address": "fa:16:3e:09:a2:29", "network": {"id": "0fdd7696-a8f7-4fbe-88a6-4c5254049911", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-706519722-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "47f21dc2b2ad4fe692324779a4a84760", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7150f662-0cf1-44f9-ae14-d70f479649b6", "external-id": "nsx-vlan-transportzone-712", "segmentation_id": 712, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap803188c7-57", "ovs_interfaceid": "803188c7-5708-409a-9465-308c5b321922", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1681.730413] env[62277]: DEBUG oslo_concurrency.lockutils [req-9f41339e-78b8-4067-8ff1-9043aa75e3c6 req-0fe252f2-3da5-4357-8281-4b163361515c service nova] Releasing lock "refresh_cache-42005809-1926-44b2-8ef6-3b6cb28a4020" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1683.838764] env[62277]: DEBUG oslo_concurrency.lockutils [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Acquiring lock "9f7d4431-d5ea-4f9b-888b-77a6a7772047" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1683.839167] env[62277]: DEBUG oslo_concurrency.lockutils [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Lock "9f7d4431-d5ea-4f9b-888b-77a6a7772047" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1695.169463] env[62277]: DEBUG oslo_concurrency.lockutils [None req-5a8d7a5c-20c5-4603-904d-25e24a06d75d tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Acquiring lock "42005809-1926-44b2-8ef6-3b6cb28a4020" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1703.169640] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62277) run_periodic_tasks 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1704.163831] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1704.168353] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1704.168522] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Starting heal instance info cache {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9909}} [ 1704.168654] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Rebuilding the list of instances to heal {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9913}} [ 1704.191170] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1704.191442] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1704.191442] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1704.191562] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1704.191685] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 8d00162c-7379-48b6-841b-f802db2582db] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1704.191804] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1704.191921] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Skipping network cache update for instance because it is Building. 
{{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1704.192052] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 13959890-87a1-45ba-98de-621373e265e7] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1704.192176] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1704.192293] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1704.192409] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Didn't find any instances for network info cache update. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9995}} [ 1706.169470] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1706.192562] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1707.169031] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1707.169031] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1708.168353] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager.update_available_resource {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1708.182973] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1708.183225] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1708.183401] env[62277]: DEBUG oslo_concurrency.lockutils [None 
req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1708.183559] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62277) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1708.184730] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e59d5d2-64bc-4df4-bd79-7fa45267c118 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1708.193699] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-53143a64-9b29-4d0a-99aa-414f2d7d2bed {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1708.208483] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2bf3d15a-3bb2-4463-b7d8-9faa0cb5b0c4 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1708.213696] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c2a9ff28-1470-4d49-ba51-cd5be0f2e2bf {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1708.244580] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181444MB free_disk=184GB free_vcpus=48 pci_devices=None {{(pid=62277) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1708.244773] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1708.244930] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1708.323134] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 1e8429b2-7149-4832-8590-e0ebd8501176 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1708.323308] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 21bd4623-2b46-43c4-859f-c4d3bf261e1f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1708.323448] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 6d759045-e1fc-43ea-a882-1ead769b6d29 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1708.323572] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 32bed248-06d5-47a1-b281-47921d99dbf6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1708.323692] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 8d00162c-7379-48b6-841b-f802db2582db actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1708.323810] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 900160c8-a715-45a4-8709-b314fc3216d5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1708.323930] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 63267d5c-d004-41c1-866a-75b9e37521b7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1708.324061] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 13959890-87a1-45ba-98de-621373e265e7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1708.324181] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance a7cc7e45-8567-4699-af83-624b1c7c5c64 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1708.324294] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 42005809-1926-44b2-8ef6-3b6cb28a4020 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1708.335283] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 1742e8d0-3cf2-4a78-99e4-652f9664df96 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1708.345375] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1708.354910] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1708.364884] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 163eb4e7-33f8-4674-8a3f-5094356e250d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1708.375791] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 2fdebb33-a32b-4753-aa2e-adfc4b252fac has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1708.385429] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 5ed6905b-ddb5-4517-a8dc-ee8e00b53db0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1708.395420] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 9f8669ed-f65f-4472-9ef6-01953c48466b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1708.404619] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 3b8f2c10-5dce-44d0-bb6b-939afc01e44b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1708.413717] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 400beb27-a709-4ef4-851e-5caaab9ca60b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1708.422801] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 272391f1-a349-4525-91ec-75b3ba7aeb1c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1708.432031] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 227394fe-d0c6-48c8-aed2-433ce34e34f8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1708.441706] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 9f7d4431-d5ea-4f9b-888b-77a6a7772047 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1708.442009] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1708.442185] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1708.670262] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48af5033-2e93-4ca6-917e-ab90484626e6 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1708.677904] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d8dc340a-3c46-4ce3-b097-15492073ecad {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1708.706884] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5a5bd5c9-13bd-4a37-83cd-c292c5ba9488 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1708.713383] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4311aaba-9f2d-4037-9fa7-04205536982d {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1708.728725] env[62277]: DEBUG nova.compute.provider_tree [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1708.736997] env[62277]: DEBUG nova.scheduler.client.report [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1708.750535] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62277) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1708.750712] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.506s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1709.751525] env[62277]: DEBUG oslo_service.periodic_task [None 
req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1710.168569] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1710.168569] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=62277) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10528}} [ 1722.738492] env[62277]: WARNING oslo_vmware.rw_handles [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1722.738492] env[62277]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1722.738492] env[62277]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1722.738492] env[62277]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1722.738492] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1722.738492] env[62277]: ERROR oslo_vmware.rw_handles response.begin() [ 1722.738492] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1722.738492] env[62277]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1722.738492] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1722.738492] env[62277]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1722.738492] env[62277]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1722.738492] env[62277]: ERROR oslo_vmware.rw_handles [ 1722.739272] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Downloaded image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to vmware_temp/6f86faaf-2bd1-42cb-aca7-f1c437f565c3/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1722.741181] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Caching image {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1722.741450] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Copying Virtual Disk [datastore2] 
vmware_temp/6f86faaf-2bd1-42cb-aca7-f1c437f565c3/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk to [datastore2] vmware_temp/6f86faaf-2bd1-42cb-aca7-f1c437f565c3/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk {{(pid=62277) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1722.741767] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-d39c1769-b3c0-4738-8769-38a8d0bf2666 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1722.750508] env[62277]: DEBUG oslo_vmware.api [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Waiting for the task: (returnval){ [ 1722.750508] env[62277]: value = "task-1405440" [ 1722.750508] env[62277]: _type = "Task" [ 1722.750508] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1722.759089] env[62277]: DEBUG oslo_vmware.api [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Task: {'id': task-1405440, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1723.261047] env[62277]: DEBUG oslo_vmware.exceptions [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Fault InvalidArgument not matched. {{(pid=62277) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1723.261326] env[62277]: DEBUG oslo_concurrency.lockutils [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1723.261873] env[62277]: ERROR nova.compute.manager [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1723.261873] env[62277]: Faults: ['InvalidArgument'] [ 1723.261873] env[62277]: ERROR nova.compute.manager [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Traceback (most recent call last): [ 1723.261873] env[62277]: ERROR nova.compute.manager [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1723.261873] env[62277]: ERROR nova.compute.manager [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] yield resources [ 1723.261873] env[62277]: ERROR nova.compute.manager [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1723.261873] env[62277]: ERROR nova.compute.manager [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] self.driver.spawn(context, instance, image_meta, [ 1723.261873] 
env[62277]: ERROR nova.compute.manager [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1723.261873] env[62277]: ERROR nova.compute.manager [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1723.261873] env[62277]: ERROR nova.compute.manager [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1723.261873] env[62277]: ERROR nova.compute.manager [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] self._fetch_image_if_missing(context, vi) [ 1723.261873] env[62277]: ERROR nova.compute.manager [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1723.261873] env[62277]: ERROR nova.compute.manager [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] image_cache(vi, tmp_image_ds_loc) [ 1723.261873] env[62277]: ERROR nova.compute.manager [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1723.261873] env[62277]: ERROR nova.compute.manager [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] vm_util.copy_virtual_disk( [ 1723.261873] env[62277]: ERROR nova.compute.manager [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1723.261873] env[62277]: ERROR nova.compute.manager [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] session._wait_for_task(vmdk_copy_task) [ 1723.261873] env[62277]: ERROR nova.compute.manager [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1723.261873] env[62277]: ERROR nova.compute.manager [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] return self.wait_for_task(task_ref) [ 1723.261873] env[62277]: ERROR nova.compute.manager [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1723.261873] env[62277]: ERROR nova.compute.manager [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] return evt.wait() [ 1723.261873] env[62277]: ERROR nova.compute.manager [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1723.261873] env[62277]: ERROR nova.compute.manager [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] result = hub.switch() [ 1723.261873] env[62277]: ERROR nova.compute.manager [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1723.261873] env[62277]: ERROR nova.compute.manager [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] return self.greenlet.switch() [ 1723.261873] env[62277]: ERROR nova.compute.manager [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1723.261873] env[62277]: ERROR nova.compute.manager [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] self.f(*self.args, **self.kw) [ 1723.261873] env[62277]: ERROR nova.compute.manager [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1723.261873] env[62277]: ERROR nova.compute.manager 
[instance: 1e8429b2-7149-4832-8590-e0ebd8501176] raise exceptions.translate_fault(task_info.error) [ 1723.261873] env[62277]: ERROR nova.compute.manager [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1723.261873] env[62277]: ERROR nova.compute.manager [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Faults: ['InvalidArgument'] [ 1723.261873] env[62277]: ERROR nova.compute.manager [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] [ 1723.262789] env[62277]: INFO nova.compute.manager [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Terminating instance [ 1723.263760] env[62277]: DEBUG oslo_concurrency.lockutils [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1723.263961] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1723.264217] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1e60e56c-96ad-450e-870f-e3b90ad04c34 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1723.266603] env[62277]: DEBUG nova.compute.manager [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Start destroying the instance on the hypervisor. 
{{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1723.266789] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1723.267505] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1252200a-489f-433f-bc7e-8bca1cda4f08 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1723.273880] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Unregistering the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1723.274092] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-30da92b1-c12a-4b2a-af6a-2b23958582df {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1723.276246] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1723.276412] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62277) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1723.277317] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c32d6190-9be9-4c77-97b3-fdbabbb6106b {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1723.281928] env[62277]: DEBUG oslo_vmware.api [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Waiting for the task: (returnval){ [ 1723.281928] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52d20b81-ec02-12a7-2b5e-ffaeff88c0da" [ 1723.281928] env[62277]: _type = "Task" [ 1723.281928] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1723.288687] env[62277]: DEBUG oslo_vmware.api [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52d20b81-ec02-12a7-2b5e-ffaeff88c0da, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1723.346020] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Unregistered the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1723.346240] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Deleting contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1723.346423] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Deleting the datastore file [datastore2] 1e8429b2-7149-4832-8590-e0ebd8501176 {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1723.346685] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-79359554-9e8a-44a4-89c7-14776cf597ba {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1723.352600] env[62277]: DEBUG oslo_vmware.api [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Waiting for the task: (returnval){ [ 1723.352600] env[62277]: value = "task-1405442" [ 1723.352600] env[62277]: _type = "Task" [ 1723.352600] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1723.359857] env[62277]: DEBUG oslo_vmware.api [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Task: {'id': task-1405442, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1723.792573] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Preparing fetch location {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1723.792986] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Creating directory with path [datastore2] vmware_temp/c97ff1ba-cfc9-4b08-9fa2-0b45eedebdb6/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1723.793062] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9cae2e97-080e-42c8-bbd6-41f994641152 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1723.803546] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Created directory with path [datastore2] vmware_temp/c97ff1ba-cfc9-4b08-9fa2-0b45eedebdb6/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1723.803717] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Fetch image to [datastore2] vmware_temp/c97ff1ba-cfc9-4b08-9fa2-0b45eedebdb6/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1723.803882] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to [datastore2] vmware_temp/c97ff1ba-cfc9-4b08-9fa2-0b45eedebdb6/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1723.804577] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc162733-857b-4063-9e9c-85b30cf5a82e {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1723.810956] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-94d48ca3-e18d-4b88-a4fb-392ba660ca86 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1723.819612] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-36a33f38-b09f-4d90-9760-3852a4094db1 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1723.849900] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-1555a3aa-7301-4782-bdd4-a7da83244321 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1723.857725] env[62277]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-86c71d1a-6d3d-4f6e-be49-c83c79976dfc {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1723.861761] env[62277]: DEBUG oslo_vmware.api [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Task: {'id': task-1405442, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.076992} completed successfully. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1723.862268] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Deleted the datastore file {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1723.862467] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Deleted contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1723.862653] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1723.862820] env[62277]: INFO nova.compute.manager [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1723.864809] env[62277]: DEBUG nova.compute.claims [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Aborting claim: {{(pid=62277) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1723.864986] env[62277]: DEBUG oslo_concurrency.lockutils [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1723.865205] env[62277]: DEBUG oslo_concurrency.lockutils [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1723.878520] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1724.019483] env[62277]: DEBUG oslo_concurrency.lockutils [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1724.021286] env[62277]: ERROR nova.compute.manager [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 6f125163-af69-40e9-92ae-3b8a01d74b60. 
[ 1724.021286] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Traceback (most recent call last): [ 1724.021286] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1724.021286] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1724.021286] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1724.021286] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] result = getattr(controller, method)(*args, **kwargs) [ 1724.021286] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1724.021286] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] return self._get(image_id) [ 1724.021286] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1724.021286] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1724.021286] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1724.021286] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] resp, body = self.http_client.get(url, headers=header) [ 1724.021286] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1724.021286] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] return self.request(url, 'GET', **kwargs) [ 1724.021286] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1724.021286] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] return self._handle_response(resp) [ 1724.021286] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1724.021286] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] raise exc.from_response(resp, resp.content) [ 1724.021286] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1724.021286] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] [ 1724.021286] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] During handling of the above exception, another exception occurred: [ 1724.021286] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] [ 1724.021286] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Traceback (most recent call last): [ 1724.021286] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1724.021286] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] yield resources [ 1724.021286] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1724.021286] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] self.driver.spawn(context, instance, image_meta, [ 1724.021286] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1724.021286] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1724.021286] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1724.021286] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] self._fetch_image_if_missing(context, vi) [ 1724.022112] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1724.022112] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] image_fetch(context, vi, tmp_image_ds_loc) [ 1724.022112] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1724.022112] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] images.fetch_image( [ 1724.022112] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1724.022112] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] metadata = IMAGE_API.get(context, image_ref) [ 1724.022112] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1724.022112] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] return session.show(context, image_id, [ 1724.022112] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1724.022112] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] _reraise_translated_image_exception(image_id) [ 1724.022112] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File 
"/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1724.022112] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] raise new_exc.with_traceback(exc_trace) [ 1724.022112] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1724.022112] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1724.022112] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1724.022112] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] result = getattr(controller, method)(*args, **kwargs) [ 1724.022112] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1724.022112] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] return self._get(image_id) [ 1724.022112] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1724.022112] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1724.022112] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1724.022112] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] resp, body = self.http_client.get(url, headers=header) [ 1724.022112] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1724.022112] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] return self.request(url, 'GET', **kwargs) [ 1724.022112] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1724.022112] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] return self._handle_response(resp) [ 1724.022112] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1724.022112] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] raise exc.from_response(resp, resp.content) [ 1724.022112] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] nova.exception.ImageNotAuthorized: Not authorized for image 6f125163-af69-40e9-92ae-3b8a01d74b60. 
[ 1724.022112] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] [ 1724.022112] env[62277]: INFO nova.compute.manager [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Terminating instance [ 1724.023213] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1724.023467] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1724.026273] env[62277]: DEBUG nova.compute.manager [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Start destroying the instance on the hypervisor. {{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1724.026465] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1724.026718] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1750648a-b253-48e0-a1bf-f09b8be93ee5 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1724.029639] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7101cefa-4111-42b4-b177-1671ae555247 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1724.037070] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Unregistering the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1724.037309] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-2271a4c2-77d5-4ea4-8fb0-487f3fb01fe7 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1724.039966] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1724.040209] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa 
tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62277) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1724.041219] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-cf534d36-cefe-40d5-b1fe-32370d80054d {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1724.047491] env[62277]: DEBUG oslo_vmware.api [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Waiting for the task: (returnval){ [ 1724.047491] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]5228d731-8381-c77d-7f5b-fe05dd278365" [ 1724.047491] env[62277]: _type = "Task" [ 1724.047491] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1724.060237] env[62277]: DEBUG oslo_vmware.api [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]5228d731-8381-c77d-7f5b-fe05dd278365, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1724.095273] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Unregistered the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1724.095628] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Deleting contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1724.095760] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Deleting the datastore file [datastore2] 21bd4623-2b46-43c4-859f-c4d3bf261e1f {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1724.096061] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-b4796a59-1ea7-49e9-bd1f-a5c1f39751a4 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1724.103021] env[62277]: DEBUG oslo_vmware.api [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Waiting for the task: (returnval){ [ 1724.103021] env[62277]: value = "task-1405444" [ 1724.103021] env[62277]: _type = "Task" [ 1724.103021] env[62277]: } to complete. 
{{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1724.113693] env[62277]: DEBUG oslo_vmware.api [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Task: {'id': task-1405444, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1724.175867] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-039ac6f7-bc1c-4ae5-925d-e1e093175fe7 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1724.183153] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5628173f-d77f-4d1e-9db2-d67b77eaad60 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1724.213457] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6d27faee-8227-49b4-b816-3bf064259e90 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1724.220794] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e4d1e83-6afd-4e89-8094-4fb94d2600fb {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1724.234534] env[62277]: DEBUG nova.compute.provider_tree [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1724.243233] env[62277]: DEBUG nova.scheduler.client.report [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1724.259386] env[62277]: DEBUG oslo_concurrency.lockutils [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.394s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1724.259890] env[62277]: ERROR nova.compute.manager [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A 
specified parameter was not correct: fileType [ 1724.259890] env[62277]: Faults: ['InvalidArgument'] [ 1724.259890] env[62277]: ERROR nova.compute.manager [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Traceback (most recent call last): [ 1724.259890] env[62277]: ERROR nova.compute.manager [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1724.259890] env[62277]: ERROR nova.compute.manager [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] self.driver.spawn(context, instance, image_meta, [ 1724.259890] env[62277]: ERROR nova.compute.manager [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1724.259890] env[62277]: ERROR nova.compute.manager [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1724.259890] env[62277]: ERROR nova.compute.manager [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1724.259890] env[62277]: ERROR nova.compute.manager [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] self._fetch_image_if_missing(context, vi) [ 1724.259890] env[62277]: ERROR nova.compute.manager [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1724.259890] env[62277]: ERROR nova.compute.manager [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] image_cache(vi, tmp_image_ds_loc) [ 1724.259890] env[62277]: ERROR nova.compute.manager [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1724.259890] env[62277]: ERROR nova.compute.manager [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] vm_util.copy_virtual_disk( [ 1724.259890] env[62277]: ERROR nova.compute.manager [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1724.259890] env[62277]: ERROR nova.compute.manager [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] session._wait_for_task(vmdk_copy_task) [ 1724.259890] env[62277]: ERROR nova.compute.manager [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1724.259890] env[62277]: ERROR nova.compute.manager [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] return self.wait_for_task(task_ref) [ 1724.259890] env[62277]: ERROR nova.compute.manager [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1724.259890] env[62277]: ERROR nova.compute.manager [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] return evt.wait() [ 1724.259890] env[62277]: ERROR nova.compute.manager [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1724.259890] env[62277]: ERROR nova.compute.manager [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] result = hub.switch() [ 1724.259890] env[62277]: ERROR nova.compute.manager [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1724.259890] env[62277]: ERROR nova.compute.manager [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] return self.greenlet.switch() [ 1724.259890] 
env[62277]: ERROR nova.compute.manager [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1724.259890] env[62277]: ERROR nova.compute.manager [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] self.f(*self.args, **self.kw) [ 1724.259890] env[62277]: ERROR nova.compute.manager [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1724.259890] env[62277]: ERROR nova.compute.manager [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] raise exceptions.translate_fault(task_info.error) [ 1724.259890] env[62277]: ERROR nova.compute.manager [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1724.259890] env[62277]: ERROR nova.compute.manager [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Faults: ['InvalidArgument'] [ 1724.259890] env[62277]: ERROR nova.compute.manager [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] [ 1724.260723] env[62277]: DEBUG nova.compute.utils [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] VimFaultException {{(pid=62277) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1724.262358] env[62277]: DEBUG nova.compute.manager [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Build of instance 1e8429b2-7149-4832-8590-e0ebd8501176 was re-scheduled: A specified parameter was not correct: fileType [ 1724.262358] env[62277]: Faults: ['InvalidArgument'] {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1724.262724] env[62277]: DEBUG nova.compute.manager [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Unplugging VIFs for instance {{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1724.262898] env[62277]: DEBUG nova.compute.manager [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1724.263078] env[62277]: DEBUG nova.compute.manager [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1724.263273] env[62277]: DEBUG nova.network.neutron [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1724.558835] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Preparing fetch location {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1724.559132] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Creating directory with path [datastore2] vmware_temp/4e751d7f-8f17-4bfe-992d-759bffb545f0/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1724.559383] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3fa33d83-658e-49e4-9ca3-1ac0df7de421 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1724.571295] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Created directory with path [datastore2] vmware_temp/4e751d7f-8f17-4bfe-992d-759bffb545f0/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1724.571295] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Fetch image to [datastore2] vmware_temp/4e751d7f-8f17-4bfe-992d-759bffb545f0/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1724.571295] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to [datastore2] vmware_temp/4e751d7f-8f17-4bfe-992d-759bffb545f0/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1724.572856] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f04b97d1-ed71-491f-bb61-eff7c26ec895 {{(pid=62277) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1724.579328] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b99a7b8b-932c-4ad1-a6df-f59f2cfe2c5a {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1724.588840] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5635eefb-e716-429b-b6de-51850194a4bd {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1724.592761] env[62277]: DEBUG nova.network.neutron [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1724.622514] env[62277]: INFO nova.compute.manager [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Took 0.36 seconds to deallocate network for instance. [ 1724.628200] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b64c31e8-4075-4e47-8025-19147575cf4b {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1724.635995] env[62277]: DEBUG oslo_vmware.api [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Task: {'id': task-1405444, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.073904} completed successfully. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1724.637542] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Deleted the datastore file {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1724.638405] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Deleted contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1724.638405] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1724.638405] env[62277]: INFO nova.compute.manager [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Took 0.61 seconds to destroy the instance on the hypervisor. 
[ 1724.639789] env[62277]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-fd05844d-1eea-486b-b8d3-9ed3ca9f4dae {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1724.642326] env[62277]: DEBUG nova.compute.claims [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Aborting claim: {{(pid=62277) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1724.642519] env[62277]: DEBUG oslo_concurrency.lockutils [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1724.642778] env[62277]: DEBUG oslo_concurrency.lockutils [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1724.661685] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1724.724740] env[62277]: DEBUG oslo_vmware.rw_handles [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/4e751d7f-8f17-4bfe-992d-759bffb545f0/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62277) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1724.726874] env[62277]: INFO nova.scheduler.client.report [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Deleted allocations for instance 1e8429b2-7149-4832-8590-e0ebd8501176 [ 1724.792359] env[62277]: DEBUG oslo_vmware.rw_handles [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Completed reading data from the image iterator. 
{{(pid=62277) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1724.792359] env[62277]: DEBUG oslo_vmware.rw_handles [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/4e751d7f-8f17-4bfe-992d-759bffb545f0/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62277) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1724.806648] env[62277]: DEBUG oslo_concurrency.lockutils [None req-44f9e31d-9f8c-45bf-83e1-40abb109ef94 tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Lock "1e8429b2-7149-4832-8590-e0ebd8501176" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 684.382s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1724.807925] env[62277]: DEBUG oslo_concurrency.lockutils [None req-0d29b2f9-dc86-48fa-abac-7c0ed78c0e4e tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Lock "1e8429b2-7149-4832-8590-e0ebd8501176" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 484.960s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1724.808145] env[62277]: DEBUG oslo_concurrency.lockutils [None req-0d29b2f9-dc86-48fa-abac-7c0ed78c0e4e tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Acquiring lock "1e8429b2-7149-4832-8590-e0ebd8501176-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1724.808348] env[62277]: DEBUG oslo_concurrency.lockutils [None req-0d29b2f9-dc86-48fa-abac-7c0ed78c0e4e tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Lock "1e8429b2-7149-4832-8590-e0ebd8501176-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1724.808509] env[62277]: DEBUG oslo_concurrency.lockutils [None req-0d29b2f9-dc86-48fa-abac-7c0ed78c0e4e tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Lock "1e8429b2-7149-4832-8590-e0ebd8501176-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1724.810697] env[62277]: INFO nova.compute.manager [None req-0d29b2f9-dc86-48fa-abac-7c0ed78c0e4e tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Terminating instance [ 1724.812528] env[62277]: DEBUG nova.compute.manager [None req-0d29b2f9-dc86-48fa-abac-7c0ed78c0e4e tempest-ServerAddressesNegativeTestJSON-1911791836 
tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Start destroying the instance on the hypervisor. {{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1724.813397] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-0d29b2f9-dc86-48fa-abac-7c0ed78c0e4e tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1724.813397] env[62277]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-384535e6-9a34-4c1d-a8d7-04b0f6c0fc77 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1724.817256] env[62277]: DEBUG nova.compute.manager [None req-d7e34775-9706-428a-8656-4894ed1026cf tempest-ServerAddressesTestJSON-868450768 tempest-ServerAddressesTestJSON-868450768-project-member] [instance: 1742e8d0-3cf2-4a78-99e4-652f9664df96] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1724.825796] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e093acd7-7d91-4b27-82ec-8399ea4c14e9 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1724.858017] env[62277]: WARNING nova.virt.vmwareapi.vmops [None req-0d29b2f9-dc86-48fa-abac-7c0ed78c0e4e tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 1e8429b2-7149-4832-8590-e0ebd8501176 could not be found. [ 1724.858017] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-0d29b2f9-dc86-48fa-abac-7c0ed78c0e4e tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1724.858017] env[62277]: INFO nova.compute.manager [None req-0d29b2f9-dc86-48fa-abac-7c0ed78c0e4e tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1724.858017] env[62277]: DEBUG oslo.service.loopingcall [None req-0d29b2f9-dc86-48fa-abac-7c0ed78c0e4e tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1724.860702] env[62277]: DEBUG nova.compute.manager [None req-d7e34775-9706-428a-8656-4894ed1026cf tempest-ServerAddressesTestJSON-868450768 tempest-ServerAddressesTestJSON-868450768-project-member] [instance: 1742e8d0-3cf2-4a78-99e4-652f9664df96] Instance disappeared before build. 
{{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1724.861914] env[62277]: DEBUG nova.compute.manager [-] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1724.862169] env[62277]: DEBUG nova.network.neutron [-] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1724.880709] env[62277]: DEBUG oslo_concurrency.lockutils [None req-d7e34775-9706-428a-8656-4894ed1026cf tempest-ServerAddressesTestJSON-868450768 tempest-ServerAddressesTestJSON-868450768-project-member] Lock "1742e8d0-3cf2-4a78-99e4-652f9664df96" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 210.786s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1724.887053] env[62277]: DEBUG nova.network.neutron [-] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1724.891325] env[62277]: DEBUG nova.compute.manager [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1724.896938] env[62277]: INFO nova.compute.manager [-] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] Took 0.03 seconds to deallocate network for instance. [ 1724.941595] env[62277]: DEBUG oslo_concurrency.lockutils [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1725.005988] env[62277]: DEBUG oslo_concurrency.lockutils [None req-0d29b2f9-dc86-48fa-abac-7c0ed78c0e4e tempest-ServerAddressesNegativeTestJSON-1911791836 tempest-ServerAddressesNegativeTestJSON-1911791836-project-member] Lock "1e8429b2-7149-4832-8590-e0ebd8501176" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.198s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1725.006832] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "1e8429b2-7149-4832-8590-e0ebd8501176" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 476.868s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1725.007046] env[62277]: INFO nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 1e8429b2-7149-4832-8590-e0ebd8501176] During sync_power_state the instance has a pending task (deleting). Skip. 
[ 1725.007220] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "1e8429b2-7149-4832-8590-e0ebd8501176" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1725.030500] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ee8f973d-3c5e-4bc0-a2b0-8e253b53b095 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1725.039927] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c187818b-5cef-4ee5-8487-a9949b6fca78 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1725.068763] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c28dbdbf-1090-45bd-833b-eb5c06f40106 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1725.075552] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9184294d-7a51-46b3-8aaa-c8bd6fb9d421 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1725.088743] env[62277]: DEBUG nova.compute.provider_tree [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1725.096888] env[62277]: DEBUG nova.scheduler.client.report [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1725.110324] env[62277]: DEBUG oslo_concurrency.lockutils [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.467s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1725.111122] env[62277]: ERROR nova.compute.manager [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Failed to build and run instance: nova.exception.ImageNotAuthorized: Not authorized for image 6f125163-af69-40e9-92ae-3b8a01d74b60. 
[ 1725.111122] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Traceback (most recent call last): [ 1725.111122] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1725.111122] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1725.111122] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1725.111122] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] result = getattr(controller, method)(*args, **kwargs) [ 1725.111122] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1725.111122] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] return self._get(image_id) [ 1725.111122] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1725.111122] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1725.111122] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1725.111122] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] resp, body = self.http_client.get(url, headers=header) [ 1725.111122] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1725.111122] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] return self.request(url, 'GET', **kwargs) [ 1725.111122] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1725.111122] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] return self._handle_response(resp) [ 1725.111122] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1725.111122] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] raise exc.from_response(resp, resp.content) [ 1725.111122] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1725.111122] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] [ 1725.111122] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] During handling of the above exception, another exception occurred: [ 1725.111122] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] [ 1725.111122] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Traceback (most recent call last): [ 1725.111122] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1725.111122] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] self.driver.spawn(context, instance, image_meta, [ 1725.111122] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1725.111122] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1725.111122] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1725.111122] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] self._fetch_image_if_missing(context, vi) [ 1725.111122] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1725.111122] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] image_fetch(context, vi, tmp_image_ds_loc) [ 1725.111948] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1725.111948] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] images.fetch_image( [ 1725.111948] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1725.111948] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] metadata = IMAGE_API.get(context, image_ref) [ 1725.111948] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1725.111948] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] return session.show(context, image_id, [ 1725.111948] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1725.111948] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] _reraise_translated_image_exception(image_id) [ 1725.111948] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1725.111948] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] raise new_exc.with_traceback(exc_trace) [ 1725.111948] env[62277]: ERROR nova.compute.manager [instance: 
21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1725.111948] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1725.111948] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1725.111948] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] result = getattr(controller, method)(*args, **kwargs) [ 1725.111948] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1725.111948] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] return self._get(image_id) [ 1725.111948] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1725.111948] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1725.111948] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1725.111948] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] resp, body = self.http_client.get(url, headers=header) [ 1725.111948] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1725.111948] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] return self.request(url, 'GET', **kwargs) [ 1725.111948] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1725.111948] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] return self._handle_response(resp) [ 1725.111948] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1725.111948] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] raise exc.from_response(resp, resp.content) [ 1725.111948] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] nova.exception.ImageNotAuthorized: Not authorized for image 6f125163-af69-40e9-92ae-3b8a01d74b60. [ 1725.111948] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] [ 1725.112687] env[62277]: DEBUG nova.compute.utils [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Not authorized for image 6f125163-af69-40e9-92ae-3b8a01d74b60. 
{{(pid=62277) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1725.113064] env[62277]: DEBUG oslo_concurrency.lockutils [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.172s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1725.114470] env[62277]: INFO nova.compute.claims [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1725.117024] env[62277]: DEBUG nova.compute.manager [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Build of instance 21bd4623-2b46-43c4-859f-c4d3bf261e1f was re-scheduled: Not authorized for image 6f125163-af69-40e9-92ae-3b8a01d74b60. {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1725.117476] env[62277]: DEBUG nova.compute.manager [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Unplugging VIFs for instance {{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1725.117751] env[62277]: DEBUG nova.compute.manager [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1725.117929] env[62277]: DEBUG nova.compute.manager [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1725.118102] env[62277]: DEBUG nova.network.neutron [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1725.238528] env[62277]: DEBUG neutronclient.v2_0.client [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=62277) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1725.240938] env[62277]: ERROR nova.compute.manager [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. [ 1725.240938] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Traceback (most recent call last): [ 1725.240938] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1725.240938] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1725.240938] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1725.240938] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] result = getattr(controller, method)(*args, **kwargs) [ 1725.240938] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1725.240938] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] return self._get(image_id) [ 1725.240938] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1725.240938] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1725.240938] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1725.240938] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] resp, body = self.http_client.get(url, headers=header) [ 1725.240938] env[62277]: ERROR nova.compute.manager [instance: 
21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1725.240938] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] return self.request(url, 'GET', **kwargs) [ 1725.240938] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1725.240938] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] return self._handle_response(resp) [ 1725.240938] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1725.240938] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] raise exc.from_response(resp, resp.content) [ 1725.240938] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1725.240938] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] [ 1725.240938] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] During handling of the above exception, another exception occurred: [ 1725.240938] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] [ 1725.240938] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Traceback (most recent call last): [ 1725.240938] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1725.240938] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] self.driver.spawn(context, instance, image_meta, [ 1725.240938] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1725.240938] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1725.240938] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1725.240938] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] self._fetch_image_if_missing(context, vi) [ 1725.240938] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1725.240938] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] image_fetch(context, vi, tmp_image_ds_loc) [ 1725.241782] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1725.241782] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] 
images.fetch_image( [ 1725.241782] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1725.241782] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] metadata = IMAGE_API.get(context, image_ref) [ 1725.241782] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1725.241782] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] return session.show(context, image_id, [ 1725.241782] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1725.241782] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] _reraise_translated_image_exception(image_id) [ 1725.241782] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1725.241782] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] raise new_exc.with_traceback(exc_trace) [ 1725.241782] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1725.241782] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1725.241782] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1725.241782] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] result = getattr(controller, method)(*args, **kwargs) [ 1725.241782] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1725.241782] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] return self._get(image_id) [ 1725.241782] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1725.241782] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1725.241782] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1725.241782] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] resp, body = self.http_client.get(url, headers=header) [ 1725.241782] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1725.241782] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] return self.request(url, 'GET', **kwargs) [ 1725.241782] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1725.241782] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] return self._handle_response(resp) [ 1725.241782] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1725.241782] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] raise exc.from_response(resp, resp.content) [ 1725.241782] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] nova.exception.ImageNotAuthorized: Not authorized for image 6f125163-af69-40e9-92ae-3b8a01d74b60. [ 1725.241782] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] [ 1725.241782] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] During handling of the above exception, another exception occurred: [ 1725.241782] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] [ 1725.241782] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Traceback (most recent call last): [ 1725.241782] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/compute/manager.py", line 2430, in _do_build_and_run_instance [ 1725.241782] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] self._build_and_run_instance(context, instance, image, [ 1725.241782] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/compute/manager.py", line 2722, in _build_and_run_instance [ 1725.242884] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] raise exception.RescheduledException( [ 1725.242884] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] nova.exception.RescheduledException: Build of instance 21bd4623-2b46-43c4-859f-c4d3bf261e1f was re-scheduled: Not authorized for image 6f125163-af69-40e9-92ae-3b8a01d74b60. 
[ 1725.242884] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] [ 1725.242884] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] During handling of the above exception, another exception occurred: [ 1725.242884] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] [ 1725.242884] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Traceback (most recent call last): [ 1725.242884] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1725.242884] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] ret = obj(*args, **kwargs) [ 1725.242884] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1725.242884] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] exception_handler_v20(status_code, error_body) [ 1725.242884] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1725.242884] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] raise client_exc(message=error_message, [ 1725.242884] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1725.242884] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Neutron server returns request_ids: ['req-98ac6965-fa46-425d-b7f6-a5555e7cff80'] [ 1725.242884] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] [ 1725.242884] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] During handling of the above exception, another exception occurred: [ 1725.242884] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] [ 1725.242884] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Traceback (most recent call last): [ 1725.242884] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/compute/manager.py", line 3019, in _cleanup_allocated_networks [ 1725.242884] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] self._deallocate_network(context, instance, requested_networks) [ 1725.242884] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1725.242884] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] self.network_api.deallocate_for_instance( [ 1725.242884] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1725.242884] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] data = neutron.list_ports(**search_opts) [ 
1725.242884] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1725.242884] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] ret = obj(*args, **kwargs) [ 1725.242884] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1725.242884] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] return self.list('ports', self.ports_path, retrieve_all, [ 1725.242884] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1725.242884] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] ret = obj(*args, **kwargs) [ 1725.242884] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1725.242884] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] for r in self._pagination(collection, path, **params): [ 1725.242884] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1725.242884] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] res = self.get(path, params=params) [ 1725.243776] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1725.243776] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] ret = obj(*args, **kwargs) [ 1725.243776] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1725.243776] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] return self.retry_request("GET", action, body=body, [ 1725.243776] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1725.243776] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] ret = obj(*args, **kwargs) [ 1725.243776] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1725.243776] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] return self.do_request(method, action, body=body, [ 1725.243776] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1725.243776] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] ret = obj(*args, **kwargs) [ 1725.243776] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 
1725.243776] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] self._handle_fault_response(status_code, replybody, resp) [ 1725.243776] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1725.243776] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] raise exception.Unauthorized() [ 1725.243776] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] nova.exception.Unauthorized: Not authorized. [ 1725.243776] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] [ 1725.303173] env[62277]: INFO nova.scheduler.client.report [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Deleted allocations for instance 21bd4623-2b46-43c4-859f-c4d3bf261e1f [ 1725.324417] env[62277]: DEBUG oslo_concurrency.lockutils [None req-3db33c17-3e92-4886-9b81-407134be0ef0 tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Lock "21bd4623-2b46-43c4-859f-c4d3bf261e1f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 592.630s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1725.326599] env[62277]: DEBUG oslo_concurrency.lockutils [None req-cef0743f-f806-4715-8c0f-19c18a124efd tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Lock "21bd4623-2b46-43c4-859f-c4d3bf261e1f" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 396.668s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1725.326828] env[62277]: DEBUG oslo_concurrency.lockutils [None req-cef0743f-f806-4715-8c0f-19c18a124efd tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Acquiring lock "21bd4623-2b46-43c4-859f-c4d3bf261e1f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1725.327044] env[62277]: DEBUG oslo_concurrency.lockutils [None req-cef0743f-f806-4715-8c0f-19c18a124efd tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Lock "21bd4623-2b46-43c4-859f-c4d3bf261e1f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1725.327215] env[62277]: DEBUG oslo_concurrency.lockutils [None req-cef0743f-f806-4715-8c0f-19c18a124efd tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Lock "21bd4623-2b46-43c4-859f-c4d3bf261e1f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1725.329215] env[62277]: INFO nova.compute.manager [None req-cef0743f-f806-4715-8c0f-19c18a124efd tempest-DeleteServersAdminTestJSON-247075979 
tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Terminating instance [ 1725.331300] env[62277]: DEBUG nova.compute.manager [None req-cef0743f-f806-4715-8c0f-19c18a124efd tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Start destroying the instance on the hypervisor. {{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1725.331429] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-cef0743f-f806-4715-8c0f-19c18a124efd tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1725.331976] env[62277]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-c3c1b556-a723-424f-9635-5eae660274ad {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1725.341436] env[62277]: DEBUG nova.compute.manager [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1725.348621] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-02b8d100-31fb-4390-bd96-8a0ea95f0275 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1725.379977] env[62277]: WARNING nova.virt.vmwareapi.vmops [None req-cef0743f-f806-4715-8c0f-19c18a124efd tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 21bd4623-2b46-43c4-859f-c4d3bf261e1f could not be found. [ 1725.380200] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-cef0743f-f806-4715-8c0f-19c18a124efd tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1725.380375] env[62277]: INFO nova.compute.manager [None req-cef0743f-f806-4715-8c0f-19c18a124efd tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1725.380618] env[62277]: DEBUG oslo.service.loopingcall [None req-cef0743f-f806-4715-8c0f-19c18a124efd tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1725.385097] env[62277]: DEBUG nova.compute.manager [-] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1725.385294] env[62277]: DEBUG nova.network.neutron [-] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1725.399593] env[62277]: DEBUG oslo_concurrency.lockutils [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1725.452902] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6562193-bcab-4d0e-b873-04927cd8bcb0 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1725.460563] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-048a94dc-26e0-4593-996a-d0586768305c {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1725.489961] env[62277]: DEBUG neutronclient.v2_0.client [-] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=62277) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1725.490227] env[62277]: ERROR nova.network.neutron [-] Neutron client was not able to generate a valid admin token, please verify Neutron admin credential located in nova.conf: neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1725.490935] env[62277]: ERROR oslo.service.loopingcall [-] Dynamic interval looping call 'oslo_service.loopingcall.RetryDecorator.__call__.._func' failed: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1725.490935] env[62277]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1725.490935] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1725.490935] env[62277]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1725.490935] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1725.490935] env[62277]: ERROR oslo.service.loopingcall exception_handler_v20(status_code, error_body) [ 1725.490935] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1725.490935] env[62277]: ERROR oslo.service.loopingcall raise client_exc(message=error_message, [ 1725.490935] env[62277]: ERROR oslo.service.loopingcall neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1725.490935] env[62277]: ERROR oslo.service.loopingcall Neutron server returns request_ids: ['req-d15bda82-ea79-45a1-bfe7-afbcbfe2d49c'] [ 1725.490935] env[62277]: ERROR oslo.service.loopingcall [ 1725.490935] env[62277]: ERROR oslo.service.loopingcall During handling of the above exception, another exception occurred: [ 1725.490935] env[62277]: ERROR oslo.service.loopingcall [ 1725.490935] env[62277]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1725.490935] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1725.490935] env[62277]: ERROR oslo.service.loopingcall result = func(*self.args, **self.kw) [ 1725.490935] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1725.490935] env[62277]: ERROR oslo.service.loopingcall result = f(*args, **kwargs) [ 1725.490935] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1725.490935] env[62277]: ERROR oslo.service.loopingcall self._deallocate_network( [ 1725.490935] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1725.490935] env[62277]: ERROR oslo.service.loopingcall self.network_api.deallocate_for_instance( [ 1725.490935] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1725.490935] env[62277]: ERROR oslo.service.loopingcall data = neutron.list_ports(**search_opts) [ 1725.490935] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1725.490935] env[62277]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1725.490935] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1725.490935] env[62277]: ERROR oslo.service.loopingcall return self.list('ports', self.ports_path, retrieve_all, [ 1725.490935] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1725.490935] env[62277]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1725.490935] env[62277]: ERROR 
oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1725.490935] env[62277]: ERROR oslo.service.loopingcall for r in self._pagination(collection, path, **params): [ 1725.490935] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1725.490935] env[62277]: ERROR oslo.service.loopingcall res = self.get(path, params=params) [ 1725.490935] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1725.490935] env[62277]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1725.490935] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1725.490935] env[62277]: ERROR oslo.service.loopingcall return self.retry_request("GET", action, body=body, [ 1725.490935] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1725.490935] env[62277]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1725.490935] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1725.490935] env[62277]: ERROR oslo.service.loopingcall return self.do_request(method, action, body=body, [ 1725.490935] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1725.490935] env[62277]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1725.490935] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1725.490935] env[62277]: ERROR oslo.service.loopingcall self._handle_fault_response(status_code, replybody, resp) [ 1725.492432] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1725.492432] env[62277]: ERROR oslo.service.loopingcall raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1725.492432] env[62277]: ERROR oslo.service.loopingcall nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1725.492432] env[62277]: ERROR oslo.service.loopingcall [ 1725.492432] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c775a653-4e2a-456f-acb8-62d13e6139fe {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1725.494582] env[62277]: ERROR nova.compute.manager [None req-cef0743f-f806-4715-8c0f-19c18a124efd tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Failed to deallocate network for instance. Error: Networking client is experiencing an unauthorized exception.: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
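The looping-call failure above repeats the hint from the earlier "Neutron client was not able to generate a valid admin token" message: the service credentials referenced by the [neutron] section of nova.conf cannot authenticate, so port cleanup for the instance fails with NeutronAdminCredentialConfigurationInvalid. A minimal, hedged check of those credentials outside Nova, assuming keystoneauth1 and python-neutronclient; auth_url, username, and password are placeholders for whatever nova.conf actually contains.

```python
# Sketch only: verify the credentials configured under [neutron] in nova.conf
# can still obtain a token and list ports. All values are placeholders.
from keystoneauth1.identity import v3
from keystoneauth1 import session
from neutronclient.v2_0 import client as neutron_client

auth = v3.Password(
    auth_url="http://controller:5000/v3",   # [neutron]/auth_url (placeholder)
    username="neutron",                      # [neutron]/username (placeholder)
    password="SERVICE_PASSWORD",             # [neutron]/password (placeholder)
    project_name="service",
    user_domain_name="Default",
    project_domain_name="Default",
)
sess = session.Session(auth=auth)

# keystoneauth1.exceptions.Unauthorized here would indicate the 401 in the log
# is a credential problem rather than a transient token-expiry race.
print("token prefix:", sess.get_token()[:8])

neutron = neutron_client.Client(session=sess)
ports = neutron.list_ports(device_id="21bd4623-2b46-43c4-859f-c4d3bf261e1f")
print(len(ports["ports"]), "ports found for the instance")
```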
[ 1725.502042] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-125845f3-fb43-4d0c-a479-dba98f3c40bd {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1725.514929] env[62277]: DEBUG nova.compute.provider_tree [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1725.524247] env[62277]: DEBUG nova.scheduler.client.report [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1725.540335] env[62277]: ERROR nova.compute.manager [None req-cef0743f-f806-4715-8c0f-19c18a124efd tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Setting instance vm_state to ERROR: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1725.540335] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Traceback (most recent call last): [ 1725.540335] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1725.540335] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] ret = obj(*args, **kwargs) [ 1725.540335] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1725.540335] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] exception_handler_v20(status_code, error_body) [ 1725.540335] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1725.540335] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] raise client_exc(message=error_message, [ 1725.540335] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1725.540335] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Neutron server returns request_ids: ['req-d15bda82-ea79-45a1-bfe7-afbcbfe2d49c'] [ 1725.540335] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] [ 1725.540335] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] During handling of the above exception, another exception occurred: [ 1725.540335] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] [ 1725.540335] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Traceback (most recent call last): [ 1725.540335] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance [ 1725.540335] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] self._delete_instance(context, instance, bdms) [ 1725.540335] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance [ 1725.540335] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] self._shutdown_instance(context, instance, bdms) [ 1725.540335] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance [ 1725.540335] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] self._try_deallocate_network(context, instance, requested_networks) [ 1725.540335] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network [ 1725.540335] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] with excutils.save_and_reraise_exception(): [ 1725.540335] env[62277]: ERROR 
nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1725.540335] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] self.force_reraise() [ 1725.540335] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1725.540335] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] raise self.value [ 1725.540335] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network [ 1725.540335] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] _deallocate_network_with_retries() [ 1725.540335] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1725.540335] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] return evt.wait() [ 1725.540335] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1725.540335] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] result = hub.switch() [ 1725.541303] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1725.541303] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] return self.greenlet.switch() [ 1725.541303] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1725.541303] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] result = func(*self.args, **self.kw) [ 1725.541303] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1725.541303] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] result = f(*args, **kwargs) [ 1725.541303] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1725.541303] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] self._deallocate_network( [ 1725.541303] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1725.541303] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] self.network_api.deallocate_for_instance( [ 1725.541303] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1725.541303] env[62277]: ERROR nova.compute.manager [instance: 
21bd4623-2b46-43c4-859f-c4d3bf261e1f] data = neutron.list_ports(**search_opts) [ 1725.541303] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1725.541303] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] ret = obj(*args, **kwargs) [ 1725.541303] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1725.541303] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] return self.list('ports', self.ports_path, retrieve_all, [ 1725.541303] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1725.541303] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] ret = obj(*args, **kwargs) [ 1725.541303] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1725.541303] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] for r in self._pagination(collection, path, **params): [ 1725.541303] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1725.541303] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] res = self.get(path, params=params) [ 1725.541303] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1725.541303] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] ret = obj(*args, **kwargs) [ 1725.541303] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1725.541303] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] return self.retry_request("GET", action, body=body, [ 1725.541303] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1725.541303] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] ret = obj(*args, **kwargs) [ 1725.541303] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1725.541303] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] return self.do_request(method, action, body=body, [ 1725.541303] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1725.541303] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] ret = obj(*args, **kwargs) [ 1725.541303] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1725.542011] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] self._handle_fault_response(status_code, replybody, resp) [ 1725.542011] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1725.542011] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1725.542011] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1725.542011] env[62277]: ERROR nova.compute.manager [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] [ 1725.543661] env[62277]: DEBUG oslo_concurrency.lockutils [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.431s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1725.544133] env[62277]: DEBUG nova.compute.manager [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Start building networks asynchronously for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1725.546363] env[62277]: DEBUG oslo_concurrency.lockutils [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.147s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1725.548042] env[62277]: INFO nova.compute.claims [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1725.580246] env[62277]: DEBUG nova.compute.utils [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Using /dev/sd instead of None {{(pid=62277) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1725.581575] env[62277]: DEBUG nova.compute.manager [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Allocating IP information in the background. 
{{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1725.581757] env[62277]: DEBUG nova.network.neutron [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] allocate_for_instance() {{(pid=62277) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1725.587236] env[62277]: DEBUG oslo_concurrency.lockutils [None req-cef0743f-f806-4715-8c0f-19c18a124efd tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Lock "21bd4623-2b46-43c4-859f-c4d3bf261e1f" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.261s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1725.592467] env[62277]: DEBUG nova.compute.manager [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Start building block device mappings for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1725.661353] env[62277]: DEBUG nova.policy [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6f64f6785a6642db82f9b311db985eb5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8d5a2c5c5d084890bfb654be2e938f63', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62277) authorize /opt/stack/nova/nova/policy.py:203}} [ 1725.662982] env[62277]: INFO nova.compute.manager [None req-cef0743f-f806-4715-8c0f-19c18a124efd tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] [instance: 21bd4623-2b46-43c4-859f-c4d3bf261e1f] Successfully reverted task state from None on failure for instance. [ 1725.669632] env[62277]: DEBUG nova.compute.manager [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Start spawning the instance on the hypervisor. {{(pid=62277) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1725.672752] env[62277]: ERROR oslo_messaging.rpc.server [None req-cef0743f-f806-4715-8c0f-19c18a124efd tempest-DeleteServersAdminTestJSON-247075979 tempest-DeleteServersAdminTestJSON-247075979-project-member] Exception during message handling: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1725.672752] env[62277]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1725.672752] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1725.672752] env[62277]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1725.672752] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1725.672752] env[62277]: ERROR oslo_messaging.rpc.server exception_handler_v20(status_code, error_body) [ 1725.672752] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1725.672752] env[62277]: ERROR oslo_messaging.rpc.server raise client_exc(message=error_message, [ 1725.672752] env[62277]: ERROR oslo_messaging.rpc.server neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1725.672752] env[62277]: ERROR oslo_messaging.rpc.server Neutron server returns request_ids: ['req-d15bda82-ea79-45a1-bfe7-afbcbfe2d49c'] [ 1725.672752] env[62277]: ERROR oslo_messaging.rpc.server [ 1725.672752] env[62277]: ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred: [ 1725.672752] env[62277]: ERROR oslo_messaging.rpc.server [ 1725.672752] env[62277]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1725.672752] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming [ 1725.672752] env[62277]: ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message) [ 1725.672752] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch [ 1725.672752] env[62277]: ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) [ 1725.672752] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch [ 1725.672752] env[62277]: ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) [ 1725.672752] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 65, in wrapped [ 1725.672752] env[62277]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1725.672752] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1725.672752] env[62277]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1725.672752] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1725.672752] env[62277]: ERROR oslo_messaging.rpc.server raise self.value [ 1725.672752] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 63, in wrapped [ 1725.672752] env[62277]: ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw) [ 1725.672752] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 166, in decorated_function [ 1725.672752] env[62277]: ERROR oslo_messaging.rpc.server with 
excutils.save_and_reraise_exception(): [ 1725.672752] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1725.672752] env[62277]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1725.672752] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1725.672752] env[62277]: ERROR oslo_messaging.rpc.server raise self.value [ 1725.672752] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 157, in decorated_function [ 1725.672752] env[62277]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1725.672752] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/utils.py", line 1439, in decorated_function [ 1725.672752] env[62277]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1725.672752] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 213, in decorated_function [ 1725.672752] env[62277]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1725.672752] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1725.672752] env[62277]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1725.672752] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1725.672752] env[62277]: ERROR oslo_messaging.rpc.server raise self.value [ 1725.674142] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 203, in decorated_function [ 1725.674142] env[62277]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1725.674142] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3327, in terminate_instance [ 1725.674142] env[62277]: ERROR oslo_messaging.rpc.server do_terminate_instance(instance, bdms) [ 1725.674142] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1725.674142] env[62277]: ERROR oslo_messaging.rpc.server return f(*args, **kwargs) [ 1725.674142] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3322, in do_terminate_instance [ 1725.674142] env[62277]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1725.674142] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1725.674142] env[62277]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1725.674142] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1725.674142] env[62277]: ERROR oslo_messaging.rpc.server raise self.value [ 1725.674142] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance [ 1725.674142] env[62277]: ERROR oslo_messaging.rpc.server self._delete_instance(context, instance, bdms) [ 1725.674142] env[62277]: ERROR oslo_messaging.rpc.server File 
"/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance [ 1725.674142] env[62277]: ERROR oslo_messaging.rpc.server self._shutdown_instance(context, instance, bdms) [ 1725.674142] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance [ 1725.674142] env[62277]: ERROR oslo_messaging.rpc.server self._try_deallocate_network(context, instance, requested_networks) [ 1725.674142] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network [ 1725.674142] env[62277]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1725.674142] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1725.674142] env[62277]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1725.674142] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1725.674142] env[62277]: ERROR oslo_messaging.rpc.server raise self.value [ 1725.674142] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network [ 1725.674142] env[62277]: ERROR oslo_messaging.rpc.server _deallocate_network_with_retries() [ 1725.674142] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1725.674142] env[62277]: ERROR oslo_messaging.rpc.server return evt.wait() [ 1725.674142] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1725.674142] env[62277]: ERROR oslo_messaging.rpc.server result = hub.switch() [ 1725.674142] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1725.674142] env[62277]: ERROR oslo_messaging.rpc.server return self.greenlet.switch() [ 1725.674142] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1725.674142] env[62277]: ERROR oslo_messaging.rpc.server result = func(*self.args, **self.kw) [ 1725.674142] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1725.674142] env[62277]: ERROR oslo_messaging.rpc.server result = f(*args, **kwargs) [ 1725.674142] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1725.674142] env[62277]: ERROR oslo_messaging.rpc.server self._deallocate_network( [ 1725.674142] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1725.674142] env[62277]: ERROR oslo_messaging.rpc.server self.network_api.deallocate_for_instance( [ 1725.674142] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1725.674142] env[62277]: ERROR oslo_messaging.rpc.server data = neutron.list_ports(**search_opts) [ 1725.674142] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1725.674142] env[62277]: ERROR 
oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1725.674142] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1725.674142] env[62277]: ERROR oslo_messaging.rpc.server return self.list('ports', self.ports_path, retrieve_all, [ 1725.674142] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1725.675945] env[62277]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1725.675945] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1725.675945] env[62277]: ERROR oslo_messaging.rpc.server for r in self._pagination(collection, path, **params): [ 1725.675945] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1725.675945] env[62277]: ERROR oslo_messaging.rpc.server res = self.get(path, params=params) [ 1725.675945] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1725.675945] env[62277]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1725.675945] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1725.675945] env[62277]: ERROR oslo_messaging.rpc.server return self.retry_request("GET", action, body=body, [ 1725.675945] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1725.675945] env[62277]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1725.675945] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1725.675945] env[62277]: ERROR oslo_messaging.rpc.server return self.do_request(method, action, body=body, [ 1725.675945] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1725.675945] env[62277]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1725.675945] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1725.675945] env[62277]: ERROR oslo_messaging.rpc.server self._handle_fault_response(status_code, replybody, resp) [ 1725.675945] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1725.675945] env[62277]: ERROR oslo_messaging.rpc.server raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1725.675945] env[62277]: ERROR oslo_messaging.rpc.server nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1725.675945] env[62277]: ERROR oslo_messaging.rpc.server [ 1725.697451] env[62277]: DEBUG nova.virt.hardware [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-11-06T22:11:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-11-06T22:11:15Z,direct_url=,disk_format='vmdk',id=6f125163-af69-40e9-92ae-3b8a01d74b60,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='63ed6630e9c140baa826f53d7a0564d1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-11-06T22:11:16Z,virtual_size=,visibility=), allow threads: False {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1725.697715] env[62277]: DEBUG nova.virt.hardware [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Flavor limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1725.697882] env[62277]: DEBUG nova.virt.hardware [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Image limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1725.698087] env[62277]: DEBUG nova.virt.hardware [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Flavor pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1725.698237] env[62277]: DEBUG nova.virt.hardware [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Image pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1725.698381] env[62277]: DEBUG nova.virt.hardware [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1725.698585] env[62277]: DEBUG nova.virt.hardware [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1725.698741] env[62277]: DEBUG nova.virt.hardware [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62277) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 1725.698917] env[62277]: DEBUG nova.virt.hardware [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Got 1 possible topologies {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1725.699091] env[62277]: DEBUG nova.virt.hardware [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1725.699266] env[62277]: DEBUG nova.virt.hardware [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1725.700280] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3566ab0-b7dd-4d9c-bf60-984d476b4ea3 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1725.708446] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ba810b26-3dcc-4519-85f2-1d2be6184234 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1725.854820] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-538ff0f5-cc3d-43e8-b54f-cee0809463b2 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1725.862334] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1bfee93d-474e-4b68-b4ca-5a9813e98122 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1725.895670] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9aa115fc-9943-45af-a3c5-add6c6bec105 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1725.902885] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af011425-f028-454e-9822-b1f05b30c79d {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1725.916823] env[62277]: DEBUG nova.compute.provider_tree [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1725.924746] env[62277]: DEBUG nova.scheduler.client.report [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 
'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1725.939358] env[62277]: DEBUG oslo_concurrency.lockutils [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.393s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1725.939823] env[62277]: DEBUG nova.compute.manager [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Start building networks asynchronously for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1725.974707] env[62277]: DEBUG nova.compute.utils [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Using /dev/sd instead of None {{(pid=62277) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1725.976303] env[62277]: DEBUG nova.compute.manager [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Allocating IP information in the background. {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1725.976478] env[62277]: DEBUG nova.network.neutron [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] allocate_for_instance() {{(pid=62277) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1725.988240] env[62277]: DEBUG nova.compute.manager [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Start building block device mappings for instance. 
{{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1726.039131] env[62277]: DEBUG nova.network.neutron [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Successfully created port: fa417b8e-802b-47c1-bb9c-beb70d1f671f {{(pid=62277) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1726.052794] env[62277]: DEBUG nova.policy [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6f64f6785a6642db82f9b311db985eb5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8d5a2c5c5d084890bfb654be2e938f63', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62277) authorize /opt/stack/nova/nova/policy.py:203}} [ 1726.056049] env[62277]: DEBUG nova.compute.manager [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Start spawning the instance on the hypervisor. {{(pid=62277) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1726.078645] env[62277]: DEBUG nova.virt.hardware [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-11-06T22:11:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-11-06T22:11:15Z,direct_url=,disk_format='vmdk',id=6f125163-af69-40e9-92ae-3b8a01d74b60,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='63ed6630e9c140baa826f53d7a0564d1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-11-06T22:11:16Z,virtual_size=,visibility=), allow threads: False {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1726.078811] env[62277]: DEBUG nova.virt.hardware [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Flavor limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1726.078883] env[62277]: DEBUG nova.virt.hardware [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Image limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1726.079581] env[62277]: DEBUG nova.virt.hardware [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 
tempest-ServerRescueNegativeTestJSON-216449667-project-member] Flavor pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1726.079790] env[62277]: DEBUG nova.virt.hardware [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Image pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1726.079984] env[62277]: DEBUG nova.virt.hardware [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1726.080192] env[62277]: DEBUG nova.virt.hardware [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1726.080305] env[62277]: DEBUG nova.virt.hardware [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1726.082027] env[62277]: DEBUG nova.virt.hardware [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Got 1 possible topologies {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1726.082027] env[62277]: DEBUG nova.virt.hardware [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1726.082027] env[62277]: DEBUG nova.virt.hardware [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1726.082027] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-40765309-15f2-4578-9c7b-a6f07be1d30a {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1726.089582] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-079a6298-b408-4a31-b202-54264c8002e1 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1726.429987] env[62277]: DEBUG nova.network.neutron [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Successfully created port: 78931ff5-5616-49bf-8142-9268951931fe 
{{(pid=62277) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1726.708916] env[62277]: DEBUG nova.compute.manager [req-ba93a669-2370-49bd-98e2-ab2b19c19640 req-771d3286-8974-4d9a-9a07-a3a12f9a568c service nova] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Received event network-vif-plugged-fa417b8e-802b-47c1-bb9c-beb70d1f671f {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 1726.709156] env[62277]: DEBUG oslo_concurrency.lockutils [req-ba93a669-2370-49bd-98e2-ab2b19c19640 req-771d3286-8974-4d9a-9a07-a3a12f9a568c service nova] Acquiring lock "b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1726.709366] env[62277]: DEBUG oslo_concurrency.lockutils [req-ba93a669-2370-49bd-98e2-ab2b19c19640 req-771d3286-8974-4d9a-9a07-a3a12f9a568c service nova] Lock "b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1726.709530] env[62277]: DEBUG oslo_concurrency.lockutils [req-ba93a669-2370-49bd-98e2-ab2b19c19640 req-771d3286-8974-4d9a-9a07-a3a12f9a568c service nova] Lock "b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1726.709691] env[62277]: DEBUG nova.compute.manager [req-ba93a669-2370-49bd-98e2-ab2b19c19640 req-771d3286-8974-4d9a-9a07-a3a12f9a568c service nova] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] No waiting events found dispatching network-vif-plugged-fa417b8e-802b-47c1-bb9c-beb70d1f671f {{(pid=62277) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1726.709853] env[62277]: WARNING nova.compute.manager [req-ba93a669-2370-49bd-98e2-ab2b19c19640 req-771d3286-8974-4d9a-9a07-a3a12f9a568c service nova] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Received unexpected event network-vif-plugged-fa417b8e-802b-47c1-bb9c-beb70d1f671f for instance with vm_state building and task_state spawning. 
[ 1726.802332] env[62277]: DEBUG nova.network.neutron [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Successfully updated port: fa417b8e-802b-47c1-bb9c-beb70d1f671f {{(pid=62277) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1726.813414] env[62277]: DEBUG oslo_concurrency.lockutils [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Acquiring lock "refresh_cache-b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1726.813566] env[62277]: DEBUG oslo_concurrency.lockutils [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Acquired lock "refresh_cache-b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1726.813723] env[62277]: DEBUG nova.network.neutron [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Building network info cache for instance {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1726.885874] env[62277]: DEBUG nova.network.neutron [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Instance cache missing network info. 
{{(pid=62277) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1727.150197] env[62277]: DEBUG nova.compute.manager [req-a1af065b-6a8d-4a80-80d8-9d51daf68c12 req-6dc8c983-a89a-4fe9-8c81-cbba7d4a30ac service nova] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Received event network-vif-plugged-78931ff5-5616-49bf-8142-9268951931fe {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 1727.150423] env[62277]: DEBUG oslo_concurrency.lockutils [req-a1af065b-6a8d-4a80-80d8-9d51daf68c12 req-6dc8c983-a89a-4fe9-8c81-cbba7d4a30ac service nova] Acquiring lock "35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1727.150631] env[62277]: DEBUG oslo_concurrency.lockutils [req-a1af065b-6a8d-4a80-80d8-9d51daf68c12 req-6dc8c983-a89a-4fe9-8c81-cbba7d4a30ac service nova] Lock "35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1727.150797] env[62277]: DEBUG oslo_concurrency.lockutils [req-a1af065b-6a8d-4a80-80d8-9d51daf68c12 req-6dc8c983-a89a-4fe9-8c81-cbba7d4a30ac service nova] Lock "35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1727.150998] env[62277]: DEBUG nova.compute.manager [req-a1af065b-6a8d-4a80-80d8-9d51daf68c12 req-6dc8c983-a89a-4fe9-8c81-cbba7d4a30ac service nova] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] No waiting events found dispatching network-vif-plugged-78931ff5-5616-49bf-8142-9268951931fe {{(pid=62277) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1727.151182] env[62277]: WARNING nova.compute.manager [req-a1af065b-6a8d-4a80-80d8-9d51daf68c12 req-6dc8c983-a89a-4fe9-8c81-cbba7d4a30ac service nova] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Received unexpected event network-vif-plugged-78931ff5-5616-49bf-8142-9268951931fe for instance with vm_state building and task_state spawning. 
[ 1727.229020] env[62277]: DEBUG nova.network.neutron [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Successfully updated port: 78931ff5-5616-49bf-8142-9268951931fe {{(pid=62277) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1727.239120] env[62277]: DEBUG oslo_concurrency.lockutils [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Acquiring lock "refresh_cache-35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1727.239297] env[62277]: DEBUG oslo_concurrency.lockutils [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Acquired lock "refresh_cache-35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1727.239433] env[62277]: DEBUG nova.network.neutron [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Building network info cache for instance {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1727.279958] env[62277]: DEBUG nova.network.neutron [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Instance cache missing network info. 
{{(pid=62277) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1727.290722] env[62277]: DEBUG nova.network.neutron [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Updating instance_info_cache with network_info: [{"id": "fa417b8e-802b-47c1-bb9c-beb70d1f671f", "address": "fa:16:3e:73:82:18", "network": {"id": "cffeb7f4-d5bf-4e82-ae06-10dbdcc8a1f5", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-865065223-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8d5a2c5c5d084890bfb654be2e938f63", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ada35c98-01a9-4352-98e4-1d20ba31f928", "external-id": "nsx-vlan-transportzone-242", "segmentation_id": 242, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapfa417b8e-80", "ovs_interfaceid": "fa417b8e-802b-47c1-bb9c-beb70d1f671f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1727.302883] env[62277]: DEBUG oslo_concurrency.lockutils [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Releasing lock "refresh_cache-b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1727.302883] env[62277]: DEBUG nova.compute.manager [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Instance network_info: |[{"id": "fa417b8e-802b-47c1-bb9c-beb70d1f671f", "address": "fa:16:3e:73:82:18", "network": {"id": "cffeb7f4-d5bf-4e82-ae06-10dbdcc8a1f5", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-865065223-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8d5a2c5c5d084890bfb654be2e938f63", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ada35c98-01a9-4352-98e4-1d20ba31f928", "external-id": "nsx-vlan-transportzone-242", "segmentation_id": 242, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapfa417b8e-80", "ovs_interfaceid": "fa417b8e-802b-47c1-bb9c-beb70d1f671f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62277) 
_allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1727.303088] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:73:82:18', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ada35c98-01a9-4352-98e4-1d20ba31f928', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'fa417b8e-802b-47c1-bb9c-beb70d1f671f', 'vif_model': 'vmxnet3'}] {{(pid=62277) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1727.310716] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Creating folder: Project (8d5a2c5c5d084890bfb654be2e938f63). Parent ref: group-v297781. {{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1727.311235] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c94d2528-e59e-4863-a478-04e61626f6c6 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1727.322270] env[62277]: INFO nova.virt.vmwareapi.vm_util [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Created folder: Project (8d5a2c5c5d084890bfb654be2e938f63) in parent group-v297781. [ 1727.322435] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Creating folder: Instances. Parent ref: group-v297866. {{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1727.324717] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-98ee5978-683b-406c-a757-b55dad1dd452 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1727.336477] env[62277]: INFO nova.virt.vmwareapi.vm_util [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Created folder: Instances in parent group-v297866. [ 1727.336712] env[62277]: DEBUG oslo.service.loopingcall [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1727.336925] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Creating VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1727.337150] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-fc5f0735-e241-4ebc-b8f2-61cc91a68886 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1727.355834] env[62277]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1727.355834] env[62277]: value = "task-1405447" [ 1727.355834] env[62277]: _type = "Task" [ 1727.355834] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1727.363398] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405447, 'name': CreateVM_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1727.497301] env[62277]: DEBUG nova.network.neutron [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Updating instance_info_cache with network_info: [{"id": "78931ff5-5616-49bf-8142-9268951931fe", "address": "fa:16:3e:32:cc:9c", "network": {"id": "cffeb7f4-d5bf-4e82-ae06-10dbdcc8a1f5", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-865065223-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8d5a2c5c5d084890bfb654be2e938f63", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ada35c98-01a9-4352-98e4-1d20ba31f928", "external-id": "nsx-vlan-transportzone-242", "segmentation_id": 242, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap78931ff5-56", "ovs_interfaceid": "78931ff5-5616-49bf-8142-9268951931fe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1727.515939] env[62277]: DEBUG oslo_concurrency.lockutils [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Releasing lock "refresh_cache-35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1727.516365] env[62277]: DEBUG nova.compute.manager [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Instance network_info: |[{"id": "78931ff5-5616-49bf-8142-9268951931fe", "address": "fa:16:3e:32:cc:9c", "network": {"id": "cffeb7f4-d5bf-4e82-ae06-10dbdcc8a1f5", "bridge": "br-int", "label": 
"tempest-ServerRescueNegativeTestJSON-865065223-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8d5a2c5c5d084890bfb654be2e938f63", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ada35c98-01a9-4352-98e4-1d20ba31f928", "external-id": "nsx-vlan-transportzone-242", "segmentation_id": 242, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap78931ff5-56", "ovs_interfaceid": "78931ff5-5616-49bf-8142-9268951931fe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1727.517165] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:32:cc:9c', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ada35c98-01a9-4352-98e4-1d20ba31f928', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '78931ff5-5616-49bf-8142-9268951931fe', 'vif_model': 'vmxnet3'}] {{(pid=62277) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1727.525913] env[62277]: DEBUG oslo.service.loopingcall [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1727.526493] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Creating VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1727.526771] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-dfa99e1a-ddc6-417e-95b1-030b45a22269 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1727.549317] env[62277]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1727.549317] env[62277]: value = "task-1405448" [ 1727.549317] env[62277]: _type = "Task" [ 1727.549317] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1727.558475] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405448, 'name': CreateVM_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1727.865422] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405447, 'name': CreateVM_Task, 'duration_secs': 0.311127} completed successfully. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1727.865594] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Created VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1727.866264] env[62277]: DEBUG oslo_concurrency.lockutils [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1727.866425] env[62277]: DEBUG oslo_concurrency.lockutils [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1727.866743] env[62277]: DEBUG oslo_concurrency.lockutils [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1727.866980] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-eeb90eec-e97e-4a73-8f54-f97d17b039fe {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1727.871234] env[62277]: DEBUG oslo_vmware.api [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Waiting for the task: (returnval){ [ 1727.871234] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]527ca152-9703-7239-9c28-9e6a953266bc" [ 1727.871234] env[62277]: _type = "Task" [ 1727.871234] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1727.878478] env[62277]: DEBUG oslo_vmware.api [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]527ca152-9703-7239-9c28-9e6a953266bc, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1728.061104] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405448, 'name': CreateVM_Task, 'duration_secs': 0.288074} completed successfully. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1728.061440] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Created VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1728.062163] env[62277]: DEBUG oslo_concurrency.lockutils [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1728.382081] env[62277]: DEBUG oslo_concurrency.lockutils [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1728.382339] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Processing image 6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1728.382547] env[62277]: DEBUG oslo_concurrency.lockutils [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1728.382756] env[62277]: DEBUG oslo_concurrency.lockutils [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1728.383116] env[62277]: DEBUG oslo_concurrency.lockutils [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1728.383373] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1b9f5606-d8a6-4a16-acfa-1e316af5d845 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1728.387562] env[62277]: DEBUG oslo_vmware.api [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Waiting for the task: (returnval){ [ 1728.387562] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52822ce2-b39d-1026-8cf8-a5cb0ceb0630" [ 1728.387562] env[62277]: _type = "Task" [ 1728.387562] 
env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1728.394829] env[62277]: DEBUG oslo_vmware.api [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52822ce2-b39d-1026-8cf8-a5cb0ceb0630, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1728.738027] env[62277]: DEBUG nova.compute.manager [req-704ff007-6182-4863-a9f1-cec6460e2cdf req-7b90fa85-7238-4109-b975-06468a449ccf service nova] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Received event network-changed-fa417b8e-802b-47c1-bb9c-beb70d1f671f {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 1728.738086] env[62277]: DEBUG nova.compute.manager [req-704ff007-6182-4863-a9f1-cec6460e2cdf req-7b90fa85-7238-4109-b975-06468a449ccf service nova] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Refreshing instance network info cache due to event network-changed-fa417b8e-802b-47c1-bb9c-beb70d1f671f. {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11104}} [ 1728.738294] env[62277]: DEBUG oslo_concurrency.lockutils [req-704ff007-6182-4863-a9f1-cec6460e2cdf req-7b90fa85-7238-4109-b975-06468a449ccf service nova] Acquiring lock "refresh_cache-b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1728.738437] env[62277]: DEBUG oslo_concurrency.lockutils [req-704ff007-6182-4863-a9f1-cec6460e2cdf req-7b90fa85-7238-4109-b975-06468a449ccf service nova] Acquired lock "refresh_cache-b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1728.738597] env[62277]: DEBUG nova.network.neutron [req-704ff007-6182-4863-a9f1-cec6460e2cdf req-7b90fa85-7238-4109-b975-06468a449ccf service nova] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Refreshing network info cache for port fa417b8e-802b-47c1-bb9c-beb70d1f671f {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1728.896954] env[62277]: DEBUG oslo_concurrency.lockutils [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1728.897261] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Processing image 6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1728.897475] env[62277]: DEBUG oslo_concurrency.lockutils [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1729.192377] env[62277]: DEBUG nova.compute.manager [req-5a54eb61-b2bd-4bd9-9479-5120cdddf960 req-0ec6e8b0-341a-4c61-9434-19555679d46e service nova] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Received event network-changed-78931ff5-5616-49bf-8142-9268951931fe {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 1729.192620] env[62277]: DEBUG nova.compute.manager [req-5a54eb61-b2bd-4bd9-9479-5120cdddf960 req-0ec6e8b0-341a-4c61-9434-19555679d46e service nova] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Refreshing instance network info cache due to event network-changed-78931ff5-5616-49bf-8142-9268951931fe. {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11104}} [ 1729.192823] env[62277]: DEBUG oslo_concurrency.lockutils [req-5a54eb61-b2bd-4bd9-9479-5120cdddf960 req-0ec6e8b0-341a-4c61-9434-19555679d46e service nova] Acquiring lock "refresh_cache-35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1729.192974] env[62277]: DEBUG oslo_concurrency.lockutils [req-5a54eb61-b2bd-4bd9-9479-5120cdddf960 req-0ec6e8b0-341a-4c61-9434-19555679d46e service nova] Acquired lock "refresh_cache-35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1729.193415] env[62277]: DEBUG nova.network.neutron [req-5a54eb61-b2bd-4bd9-9479-5120cdddf960 req-0ec6e8b0-341a-4c61-9434-19555679d46e service nova] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Refreshing network info cache for port 78931ff5-5616-49bf-8142-9268951931fe {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1729.247259] env[62277]: DEBUG nova.network.neutron [req-704ff007-6182-4863-a9f1-cec6460e2cdf req-7b90fa85-7238-4109-b975-06468a449ccf service nova] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Updated VIF entry in instance network info cache for port fa417b8e-802b-47c1-bb9c-beb70d1f671f. 
{{(pid=62277) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1729.247604] env[62277]: DEBUG nova.network.neutron [req-704ff007-6182-4863-a9f1-cec6460e2cdf req-7b90fa85-7238-4109-b975-06468a449ccf service nova] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Updating instance_info_cache with network_info: [{"id": "fa417b8e-802b-47c1-bb9c-beb70d1f671f", "address": "fa:16:3e:73:82:18", "network": {"id": "cffeb7f4-d5bf-4e82-ae06-10dbdcc8a1f5", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-865065223-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8d5a2c5c5d084890bfb654be2e938f63", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ada35c98-01a9-4352-98e4-1d20ba31f928", "external-id": "nsx-vlan-transportzone-242", "segmentation_id": 242, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapfa417b8e-80", "ovs_interfaceid": "fa417b8e-802b-47c1-bb9c-beb70d1f671f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1729.256588] env[62277]: DEBUG oslo_concurrency.lockutils [req-704ff007-6182-4863-a9f1-cec6460e2cdf req-7b90fa85-7238-4109-b975-06468a449ccf service nova] Releasing lock "refresh_cache-b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1729.368990] env[62277]: DEBUG oslo_concurrency.lockutils [None req-048456fb-7e13-412f-a5bf-3a6c3c06d6e3 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Acquiring lock "35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1729.739821] env[62277]: DEBUG nova.network.neutron [req-5a54eb61-b2bd-4bd9-9479-5120cdddf960 req-0ec6e8b0-341a-4c61-9434-19555679d46e service nova] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Updated VIF entry in instance network info cache for port 78931ff5-5616-49bf-8142-9268951931fe. 
{{(pid=62277) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1729.740281] env[62277]: DEBUG nova.network.neutron [req-5a54eb61-b2bd-4bd9-9479-5120cdddf960 req-0ec6e8b0-341a-4c61-9434-19555679d46e service nova] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Updating instance_info_cache with network_info: [{"id": "78931ff5-5616-49bf-8142-9268951931fe", "address": "fa:16:3e:32:cc:9c", "network": {"id": "cffeb7f4-d5bf-4e82-ae06-10dbdcc8a1f5", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-865065223-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8d5a2c5c5d084890bfb654be2e938f63", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ada35c98-01a9-4352-98e4-1d20ba31f928", "external-id": "nsx-vlan-transportzone-242", "segmentation_id": 242, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap78931ff5-56", "ovs_interfaceid": "78931ff5-5616-49bf-8142-9268951931fe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1729.749140] env[62277]: DEBUG oslo_concurrency.lockutils [req-5a54eb61-b2bd-4bd9-9479-5120cdddf960 req-0ec6e8b0-341a-4c61-9434-19555679d46e service nova] Releasing lock "refresh_cache-35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1732.003786] env[62277]: DEBUG oslo_concurrency.lockutils [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Acquiring lock "8b9ef530-e79f-4cd4-8a88-83871ed65f90" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1732.004105] env[62277]: DEBUG oslo_concurrency.lockutils [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Lock "8b9ef530-e79f-4cd4-8a88-83871ed65f90" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1732.236953] env[62277]: DEBUG oslo_concurrency.lockutils [None req-986881fe-22f5-4969-b560-0c069535f231 tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Acquiring lock "9bf5cf30-5142-4d51-b5d9-1bbfb9eedce8" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1732.237222] env[62277]: DEBUG oslo_concurrency.lockutils [None req-986881fe-22f5-4969-b560-0c069535f231 tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Lock 
"9bf5cf30-5142-4d51-b5d9-1bbfb9eedce8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1763.169357] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1764.164390] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1766.169869] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1766.170220] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Starting heal instance info cache {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9909}} [ 1766.170220] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Rebuilding the list of instances to heal {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9913}} [ 1766.195651] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1766.195821] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1766.195982] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 8d00162c-7379-48b6-841b-f802db2582db] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1766.196147] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1766.196274] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1766.196397] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 13959890-87a1-45ba-98de-621373e265e7] Skipping network cache update for instance because it is Building. 
{{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1766.196526] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1766.196648] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1766.196768] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1766.196886] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1766.197013] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Didn't find any instances for network info cache update. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9995}} [ 1767.168674] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1767.169103] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1768.168874] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1770.169763] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1770.170037] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1770.170187] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=62277) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10528}} [ 1770.170356] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager.update_available_resource {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1770.183191] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1770.183406] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1770.183611] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1770.183764] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62277) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1770.184911] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-27a27020-5e20-4b62-a9c1-38a1174c822a {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1770.193735] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb52d67e-88e2-4edc-a7b8-f34e27a9ad12 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1770.207523] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44702abd-233f-43b7-bec5-ceb2c7ede6c7 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1770.213555] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f43f2ac2-4a93-44cf-94a7-2a3d16bd9eb8 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1770.243182] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181423MB free_disk=184GB free_vcpus=48 pci_devices=None {{(pid=62277) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1770.243182] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 
1770.243368] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1770.318955] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 6d759045-e1fc-43ea-a882-1ead769b6d29 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1770.319205] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 32bed248-06d5-47a1-b281-47921d99dbf6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1770.319377] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 8d00162c-7379-48b6-841b-f802db2582db actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1770.319531] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 900160c8-a715-45a4-8709-b314fc3216d5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1770.319661] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 63267d5c-d004-41c1-866a-75b9e37521b7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1770.319781] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 13959890-87a1-45ba-98de-621373e265e7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1770.319896] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance a7cc7e45-8567-4699-af83-624b1c7c5c64 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1770.320027] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 42005809-1926-44b2-8ef6-3b6cb28a4020 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1770.320150] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1770.320261] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1770.331463] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 163eb4e7-33f8-4674-8a3f-5094356e250d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1770.343687] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 2fdebb33-a32b-4753-aa2e-adfc4b252fac has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1770.354010] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 5ed6905b-ddb5-4517-a8dc-ee8e00b53db0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1770.363846] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 9f8669ed-f65f-4472-9ef6-01953c48466b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1770.374025] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 3b8f2c10-5dce-44d0-bb6b-939afc01e44b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1770.384917] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 400beb27-a709-4ef4-851e-5caaab9ca60b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1770.395206] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 272391f1-a349-4525-91ec-75b3ba7aeb1c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1770.404257] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 227394fe-d0c6-48c8-aed2-433ce34e34f8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1770.414736] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 9f7d4431-d5ea-4f9b-888b-77a6a7772047 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1770.424110] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 8b9ef530-e79f-4cd4-8a88-83871ed65f90 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1770.433679] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 9bf5cf30-5142-4d51-b5d9-1bbfb9eedce8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1770.433913] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1770.434071] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1770.655945] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2d8b02c7-d9eb-4f98-9de8-411c4d90ac0e {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1770.663322] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-441a286b-e17a-4000-89fe-bbd2a94ed2e5 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1770.693440] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c0c3556d-46b4-45fd-842a-3e4cde78d0a1 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1770.700400] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-36db9197-99e2-44a9-be31-ec482c4faeb4 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1770.712749] env[62277]: DEBUG nova.compute.provider_tree [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1770.721089] env[62277]: DEBUG nova.scheduler.client.report [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1770.733778] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62277) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1770.733991] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.491s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1773.511204] env[62277]: WARNING oslo_vmware.rw_handles [None 
req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1773.511204] env[62277]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1773.511204] env[62277]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1773.511204] env[62277]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1773.511204] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1773.511204] env[62277]: ERROR oslo_vmware.rw_handles response.begin() [ 1773.511204] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1773.511204] env[62277]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1773.511204] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1773.511204] env[62277]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1773.511204] env[62277]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1773.511204] env[62277]: ERROR oslo_vmware.rw_handles [ 1773.511746] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Downloaded image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to vmware_temp/4e751d7f-8f17-4bfe-992d-759bffb545f0/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1773.513386] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Caching image {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1773.513629] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Copying Virtual Disk [datastore2] vmware_temp/4e751d7f-8f17-4bfe-992d-759bffb545f0/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk to [datastore2] vmware_temp/4e751d7f-8f17-4bfe-992d-759bffb545f0/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk {{(pid=62277) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1773.513906] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-10840293-a2ee-43b6-b4a5-3fff773775a2 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1773.521373] env[62277]: DEBUG oslo_vmware.api [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Waiting for the task: (returnval){ [ 1773.521373] env[62277]: value = "task-1405449" [ 1773.521373] env[62277]: _type = "Task" [ 1773.521373] 
env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1773.529195] env[62277]: DEBUG oslo_vmware.api [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Task: {'id': task-1405449, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1774.032211] env[62277]: DEBUG oslo_vmware.exceptions [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Fault InvalidArgument not matched. {{(pid=62277) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1774.032496] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1774.033134] env[62277]: ERROR nova.compute.manager [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1774.033134] env[62277]: Faults: ['InvalidArgument'] [ 1774.033134] env[62277]: ERROR nova.compute.manager [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Traceback (most recent call last): [ 1774.033134] env[62277]: ERROR nova.compute.manager [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1774.033134] env[62277]: ERROR nova.compute.manager [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] yield resources [ 1774.033134] env[62277]: ERROR nova.compute.manager [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1774.033134] env[62277]: ERROR nova.compute.manager [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] self.driver.spawn(context, instance, image_meta, [ 1774.033134] env[62277]: ERROR nova.compute.manager [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1774.033134] env[62277]: ERROR nova.compute.manager [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1774.033134] env[62277]: ERROR nova.compute.manager [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1774.033134] env[62277]: ERROR nova.compute.manager [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] self._fetch_image_if_missing(context, vi) [ 1774.033134] env[62277]: ERROR nova.compute.manager [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1774.033134] env[62277]: ERROR nova.compute.manager [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] image_cache(vi, tmp_image_ds_loc) [ 1774.033134] env[62277]: ERROR 
nova.compute.manager [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1774.033134] env[62277]: ERROR nova.compute.manager [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] vm_util.copy_virtual_disk( [ 1774.033134] env[62277]: ERROR nova.compute.manager [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1774.033134] env[62277]: ERROR nova.compute.manager [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] session._wait_for_task(vmdk_copy_task) [ 1774.033134] env[62277]: ERROR nova.compute.manager [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1774.033134] env[62277]: ERROR nova.compute.manager [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] return self.wait_for_task(task_ref) [ 1774.033134] env[62277]: ERROR nova.compute.manager [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1774.033134] env[62277]: ERROR nova.compute.manager [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] return evt.wait() [ 1774.033134] env[62277]: ERROR nova.compute.manager [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1774.033134] env[62277]: ERROR nova.compute.manager [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] result = hub.switch() [ 1774.033134] env[62277]: ERROR nova.compute.manager [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1774.033134] env[62277]: ERROR nova.compute.manager [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] return self.greenlet.switch() [ 1774.033134] env[62277]: ERROR nova.compute.manager [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1774.033134] env[62277]: ERROR nova.compute.manager [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] self.f(*self.args, **self.kw) [ 1774.033134] env[62277]: ERROR nova.compute.manager [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1774.033134] env[62277]: ERROR nova.compute.manager [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] raise exceptions.translate_fault(task_info.error) [ 1774.033134] env[62277]: ERROR nova.compute.manager [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1774.033134] env[62277]: ERROR nova.compute.manager [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Faults: ['InvalidArgument'] [ 1774.033134] env[62277]: ERROR nova.compute.manager [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] [ 1774.033908] env[62277]: INFO nova.compute.manager [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Terminating instance [ 1774.035633] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Acquired lock 
"[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1774.035633] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1774.035633] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9ffdea0b-a612-4565-8738-415f6dbb57f3 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1774.037713] env[62277]: DEBUG nova.compute.manager [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Start destroying the instance on the hypervisor. {{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1774.037897] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1774.038692] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7fb436fd-4c55-460d-8bd1-f9c2e07bb3ea {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1774.045693] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Unregistering the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1774.046683] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-fbf4f011-ea27-44d0-b677-5a2dbfd0a9b7 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1774.048025] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1774.048230] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=62277) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1774.048872] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-8aea8fe5-dab5-4121-b7f8-6a1dc777dce2 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1774.053722] env[62277]: DEBUG oslo_vmware.api [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Waiting for the task: (returnval){ [ 1774.053722] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52e7d6bd-44d6-8385-61ad-b8d0ea535eca" [ 1774.053722] env[62277]: _type = "Task" [ 1774.053722] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1774.060786] env[62277]: DEBUG oslo_vmware.api [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52e7d6bd-44d6-8385-61ad-b8d0ea535eca, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1774.116131] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Unregistered the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1774.116360] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Deleting contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1774.116533] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Deleting the datastore file [datastore2] 6d759045-e1fc-43ea-a882-1ead769b6d29 {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1774.116808] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-e8fc181f-69cc-405a-ae95-1b9ece3108a1 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1774.122911] env[62277]: DEBUG oslo_vmware.api [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Waiting for the task: (returnval){ [ 1774.122911] env[62277]: value = "task-1405451" [ 1774.122911] env[62277]: _type = "Task" [ 1774.122911] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1774.130391] env[62277]: DEBUG oslo_vmware.api [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Task: {'id': task-1405451, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1774.564166] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Preparing fetch location {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1774.564435] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Creating directory with path [datastore2] vmware_temp/0f5b46fd-b48a-45e3-91cb-ca3076ebd654/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1774.564664] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-544e4ec7-efc0-490b-b65b-1a496ae98c44 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1774.576968] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Created directory with path [datastore2] vmware_temp/0f5b46fd-b48a-45e3-91cb-ca3076ebd654/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1774.577188] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Fetch image to [datastore2] vmware_temp/0f5b46fd-b48a-45e3-91cb-ca3076ebd654/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1774.577380] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to [datastore2] vmware_temp/0f5b46fd-b48a-45e3-91cb-ca3076ebd654/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1774.578149] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-15ead280-f236-47af-a3bc-86ff73f6e909 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1774.584630] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5015abd5-9062-44ee-9b87-74c8baf9ae1f {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1774.594612] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ccbf8682-2a8b-499a-8faa-bf3c62c1aa8e {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1774.627797] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-38fcc0fd-b277-4431-9637-10f049743914 {{(pid=62277) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1774.635063] env[62277]: DEBUG oslo_vmware.api [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Task: {'id': task-1405451, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.07518} completed successfully. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1774.636535] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Deleted the datastore file {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1774.636723] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Deleted contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1774.636893] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1774.637075] env[62277]: INFO nova.compute.manager [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1774.639212] env[62277]: DEBUG nova.compute.claims [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Aborting claim: {{(pid=62277) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1774.639418] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1774.639593] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1774.642204] env[62277]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-7fb49ffa-4864-4fc2-bc7b-264baed632b2 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1774.664464] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1774.722060] env[62277]: DEBUG oslo_vmware.rw_handles [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0f5b46fd-b48a-45e3-91cb-ca3076ebd654/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62277) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1774.788128] env[62277]: DEBUG oslo_vmware.rw_handles [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Completed reading data from the image iterator. {{(pid=62277) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1774.788128] env[62277]: DEBUG oslo_vmware.rw_handles [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0f5b46fd-b48a-45e3-91cb-ca3076ebd654/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=62277) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1775.009517] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e219402-4968-4bb7-ac47-4ef874debd07 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1775.017260] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d87b368f-a4dd-4fec-b92e-3b066cdf312c {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1775.046880] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3ddb09b-0e06-4be8-b974-62ae3c3cd3a4 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1775.053612] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4cc07f46-6a56-4481-81a3-99bab96c5cdb {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1775.066271] env[62277]: DEBUG nova.compute.provider_tree [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1775.074381] env[62277]: DEBUG nova.scheduler.client.report [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1775.088349] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.449s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1775.088851] env[62277]: ERROR nova.compute.manager [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1775.088851] env[62277]: Faults: ['InvalidArgument'] [ 1775.088851] env[62277]: ERROR nova.compute.manager [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Traceback (most recent call last): [ 1775.088851] env[62277]: ERROR nova.compute.manager [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1775.088851] 
env[62277]: ERROR nova.compute.manager [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] self.driver.spawn(context, instance, image_meta, [ 1775.088851] env[62277]: ERROR nova.compute.manager [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1775.088851] env[62277]: ERROR nova.compute.manager [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1775.088851] env[62277]: ERROR nova.compute.manager [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1775.088851] env[62277]: ERROR nova.compute.manager [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] self._fetch_image_if_missing(context, vi) [ 1775.088851] env[62277]: ERROR nova.compute.manager [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1775.088851] env[62277]: ERROR nova.compute.manager [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] image_cache(vi, tmp_image_ds_loc) [ 1775.088851] env[62277]: ERROR nova.compute.manager [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1775.088851] env[62277]: ERROR nova.compute.manager [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] vm_util.copy_virtual_disk( [ 1775.088851] env[62277]: ERROR nova.compute.manager [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1775.088851] env[62277]: ERROR nova.compute.manager [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] session._wait_for_task(vmdk_copy_task) [ 1775.088851] env[62277]: ERROR nova.compute.manager [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1775.088851] env[62277]: ERROR nova.compute.manager [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] return self.wait_for_task(task_ref) [ 1775.088851] env[62277]: ERROR nova.compute.manager [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1775.088851] env[62277]: ERROR nova.compute.manager [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] return evt.wait() [ 1775.088851] env[62277]: ERROR nova.compute.manager [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1775.088851] env[62277]: ERROR nova.compute.manager [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] result = hub.switch() [ 1775.088851] env[62277]: ERROR nova.compute.manager [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1775.088851] env[62277]: ERROR nova.compute.manager [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] return self.greenlet.switch() [ 1775.088851] env[62277]: ERROR nova.compute.manager [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1775.088851] env[62277]: ERROR nova.compute.manager [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] self.f(*self.args, **self.kw) [ 1775.088851] env[62277]: ERROR nova.compute.manager [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1775.088851] env[62277]: ERROR nova.compute.manager [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] raise exceptions.translate_fault(task_info.error) [ 1775.088851] env[62277]: ERROR nova.compute.manager [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1775.088851] env[62277]: ERROR nova.compute.manager [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Faults: ['InvalidArgument'] [ 1775.088851] env[62277]: ERROR nova.compute.manager [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] [ 1775.089695] env[62277]: DEBUG nova.compute.utils [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] VimFaultException {{(pid=62277) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1775.090868] env[62277]: DEBUG nova.compute.manager [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Build of instance 6d759045-e1fc-43ea-a882-1ead769b6d29 was re-scheduled: A specified parameter was not correct: fileType [ 1775.090868] env[62277]: Faults: ['InvalidArgument'] {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1775.091234] env[62277]: DEBUG nova.compute.manager [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Unplugging VIFs for instance {{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1775.091406] env[62277]: DEBUG nova.compute.manager [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1775.091570] env[62277]: DEBUG nova.compute.manager [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1775.091726] env[62277]: DEBUG nova.network.neutron [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1775.487478] env[62277]: DEBUG nova.network.neutron [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1775.500494] env[62277]: INFO nova.compute.manager [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Took 0.41 seconds to deallocate network for instance. [ 1775.620936] env[62277]: INFO nova.scheduler.client.report [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Deleted allocations for instance 6d759045-e1fc-43ea-a882-1ead769b6d29 [ 1775.644191] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7553aadb-2f8a-4041-b5a2-641f1d9bf6aa tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Lock "6d759045-e1fc-43ea-a882-1ead769b6d29" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 585.373s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1775.645405] env[62277]: DEBUG oslo_concurrency.lockutils [None req-ce806ea2-0ed1-4e6b-84a6-51b6227c5ad3 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Lock "6d759045-e1fc-43ea-a882-1ead769b6d29" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 388.810s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1775.645612] env[62277]: DEBUG oslo_concurrency.lockutils [None req-ce806ea2-0ed1-4e6b-84a6-51b6227c5ad3 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Acquiring lock "6d759045-e1fc-43ea-a882-1ead769b6d29-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1775.645816] env[62277]: DEBUG oslo_concurrency.lockutils [None req-ce806ea2-0ed1-4e6b-84a6-51b6227c5ad3 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Lock "6d759045-e1fc-43ea-a882-1ead769b6d29-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1775.645980] env[62277]: DEBUG oslo_concurrency.lockutils [None req-ce806ea2-0ed1-4e6b-84a6-51b6227c5ad3 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Lock "6d759045-e1fc-43ea-a882-1ead769b6d29-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1775.648825] env[62277]: INFO nova.compute.manager [None req-ce806ea2-0ed1-4e6b-84a6-51b6227c5ad3 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Terminating instance [ 1775.651142] env[62277]: DEBUG nova.compute.manager [None req-ce806ea2-0ed1-4e6b-84a6-51b6227c5ad3 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Start destroying the instance on the hypervisor. {{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1775.651340] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-ce806ea2-0ed1-4e6b-84a6-51b6227c5ad3 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1775.651603] env[62277]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-7f48bb63-1be6-4fc8-b485-cf16a8100f0f {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1775.660731] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dfced0d3-1ed7-4519-b21a-0acac4db4d5c {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1775.672760] env[62277]: DEBUG nova.compute.manager [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1775.694162] env[62277]: WARNING nova.virt.vmwareapi.vmops [None req-ce806ea2-0ed1-4e6b-84a6-51b6227c5ad3 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 6d759045-e1fc-43ea-a882-1ead769b6d29 could not be found. 
[ 1775.694819] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-ce806ea2-0ed1-4e6b-84a6-51b6227c5ad3 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1775.694819] env[62277]: INFO nova.compute.manager [None req-ce806ea2-0ed1-4e6b-84a6-51b6227c5ad3 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1775.694819] env[62277]: DEBUG oslo.service.loopingcall [None req-ce806ea2-0ed1-4e6b-84a6-51b6227c5ad3 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1775.695096] env[62277]: DEBUG nova.compute.manager [-] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1775.695159] env[62277]: DEBUG nova.network.neutron [-] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1775.720390] env[62277]: DEBUG nova.network.neutron [-] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1775.724452] env[62277]: DEBUG oslo_concurrency.lockutils [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1775.724676] env[62277]: DEBUG oslo_concurrency.lockutils [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1775.726179] env[62277]: INFO nova.compute.claims [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1775.729558] env[62277]: INFO nova.compute.manager [-] [instance: 6d759045-e1fc-43ea-a882-1ead769b6d29] Took 0.03 seconds to deallocate network for instance. 
[ 1775.820457] env[62277]: DEBUG oslo_concurrency.lockutils [None req-ce806ea2-0ed1-4e6b-84a6-51b6227c5ad3 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Lock "6d759045-e1fc-43ea-a882-1ead769b6d29" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.175s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1776.003894] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9c77b7eb-1b3b-4156-b22f-2f5e2513a96c {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1776.011934] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-22d4b308-f8ea-4861-974e-f5b673c67d30 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1776.040849] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ee8205c0-a207-4c9d-8f75-ff6623dd5e73 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1776.047692] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-56dfe42e-9e88-478e-8928-8d14b2626c4e {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1776.060304] env[62277]: DEBUG nova.compute.provider_tree [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1776.069055] env[62277]: DEBUG nova.scheduler.client.report [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1776.084427] env[62277]: DEBUG oslo_concurrency.lockutils [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.360s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1776.084868] env[62277]: DEBUG nova.compute.manager [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Start building networks asynchronously for instance. 
{{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1776.117315] env[62277]: DEBUG nova.compute.utils [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Using /dev/sd instead of None {{(pid=62277) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1776.118701] env[62277]: DEBUG nova.compute.manager [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Allocating IP information in the background. {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1776.118871] env[62277]: DEBUG nova.network.neutron [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] allocate_for_instance() {{(pid=62277) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1776.127123] env[62277]: DEBUG nova.compute.manager [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Start building block device mappings for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1776.175752] env[62277]: DEBUG nova.policy [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3a834d1a58b94907bc6944154314dce9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '24482eabb41e4102a26c9e7576a49c33', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62277) authorize /opt/stack/nova/nova/policy.py:203}} [ 1776.190781] env[62277]: DEBUG nova.compute.manager [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Start spawning the instance on the hypervisor. 
{{(pid=62277) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1776.216752] env[62277]: DEBUG nova.virt.hardware [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-11-06T22:11:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-11-06T22:11:15Z,direct_url=,disk_format='vmdk',id=6f125163-af69-40e9-92ae-3b8a01d74b60,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='63ed6630e9c140baa826f53d7a0564d1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-11-06T22:11:16Z,virtual_size=,visibility=), allow threads: False {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1776.216992] env[62277]: DEBUG nova.virt.hardware [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Flavor limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1776.217158] env[62277]: DEBUG nova.virt.hardware [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Image limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1776.217345] env[62277]: DEBUG nova.virt.hardware [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Flavor pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1776.217541] env[62277]: DEBUG nova.virt.hardware [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Image pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1776.217648] env[62277]: DEBUG nova.virt.hardware [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1776.217819] env[62277]: DEBUG nova.virt.hardware [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1776.217970] env[62277]: DEBUG nova.virt.hardware [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1776.218165] 
env[62277]: DEBUG nova.virt.hardware [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Got 1 possible topologies {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1776.218316] env[62277]: DEBUG nova.virt.hardware [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1776.218481] env[62277]: DEBUG nova.virt.hardware [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1776.219336] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c25680d-6417-40be-b8d0-5f61605ea6d9 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1776.227036] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5a7f925f-858f-489b-b067-1dd674c9a5d3 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1776.592498] env[62277]: DEBUG nova.network.neutron [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Successfully created port: 0c28d2e1-fbd5-4d08-80b6-08d75559aa69 {{(pid=62277) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1777.307298] env[62277]: DEBUG nova.network.neutron [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Successfully updated port: 0c28d2e1-fbd5-4d08-80b6-08d75559aa69 {{(pid=62277) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1777.318011] env[62277]: DEBUG oslo_concurrency.lockutils [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Acquiring lock "refresh_cache-163eb4e7-33f8-4674-8a3f-5094356e250d" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1777.318337] env[62277]: DEBUG oslo_concurrency.lockutils [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Acquired lock "refresh_cache-163eb4e7-33f8-4674-8a3f-5094356e250d" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1777.318337] env[62277]: DEBUG nova.network.neutron [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Building network info cache for instance {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1777.356164] env[62277]: DEBUG 
nova.network.neutron [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Instance cache missing network info. {{(pid=62277) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1777.527316] env[62277]: DEBUG nova.network.neutron [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Updating instance_info_cache with network_info: [{"id": "0c28d2e1-fbd5-4d08-80b6-08d75559aa69", "address": "fa:16:3e:90:02:fa", "network": {"id": "83a53f5b-0798-4c93-9294-0cdb526dc3ca", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1943573639-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "24482eabb41e4102a26c9e7576a49c33", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f85835c8-5d0c-4b2f-97c4-6c4006580f79", "external-id": "nsx-vlan-transportzone-245", "segmentation_id": 245, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0c28d2e1-fb", "ovs_interfaceid": "0c28d2e1-fbd5-4d08-80b6-08d75559aa69", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1777.542032] env[62277]: DEBUG oslo_concurrency.lockutils [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Releasing lock "refresh_cache-163eb4e7-33f8-4674-8a3f-5094356e250d" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1777.542389] env[62277]: DEBUG nova.compute.manager [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Instance network_info: |[{"id": "0c28d2e1-fbd5-4d08-80b6-08d75559aa69", "address": "fa:16:3e:90:02:fa", "network": {"id": "83a53f5b-0798-4c93-9294-0cdb526dc3ca", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1943573639-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "24482eabb41e4102a26c9e7576a49c33", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f85835c8-5d0c-4b2f-97c4-6c4006580f79", "external-id": "nsx-vlan-transportzone-245", "segmentation_id": 245, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0c28d2e1-fb", 
"ovs_interfaceid": "0c28d2e1-fbd5-4d08-80b6-08d75559aa69", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1777.542785] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:90:02:fa', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'f85835c8-5d0c-4b2f-97c4-6c4006580f79', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '0c28d2e1-fbd5-4d08-80b6-08d75559aa69', 'vif_model': 'vmxnet3'}] {{(pid=62277) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1777.550469] env[62277]: DEBUG oslo.service.loopingcall [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1777.551801] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Creating VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1777.552837] env[62277]: DEBUG nova.compute.manager [req-ba6fb88a-1d46-4287-a952-ee6e35c19440 req-9c34a567-e1d0-4560-92b4-62f5645009d5 service nova] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Received event network-vif-plugged-0c28d2e1-fbd5-4d08-80b6-08d75559aa69 {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 1777.553043] env[62277]: DEBUG oslo_concurrency.lockutils [req-ba6fb88a-1d46-4287-a952-ee6e35c19440 req-9c34a567-e1d0-4560-92b4-62f5645009d5 service nova] Acquiring lock "163eb4e7-33f8-4674-8a3f-5094356e250d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1777.553253] env[62277]: DEBUG oslo_concurrency.lockutils [req-ba6fb88a-1d46-4287-a952-ee6e35c19440 req-9c34a567-e1d0-4560-92b4-62f5645009d5 service nova] Lock "163eb4e7-33f8-4674-8a3f-5094356e250d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1777.553420] env[62277]: DEBUG oslo_concurrency.lockutils [req-ba6fb88a-1d46-4287-a952-ee6e35c19440 req-9c34a567-e1d0-4560-92b4-62f5645009d5 service nova] Lock "163eb4e7-33f8-4674-8a3f-5094356e250d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1777.553584] env[62277]: DEBUG nova.compute.manager [req-ba6fb88a-1d46-4287-a952-ee6e35c19440 req-9c34a567-e1d0-4560-92b4-62f5645009d5 service nova] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] No waiting events found dispatching network-vif-plugged-0c28d2e1-fbd5-4d08-80b6-08d75559aa69 {{(pid=62277) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1777.553739] env[62277]: 
WARNING nova.compute.manager [req-ba6fb88a-1d46-4287-a952-ee6e35c19440 req-9c34a567-e1d0-4560-92b4-62f5645009d5 service nova] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Received unexpected event network-vif-plugged-0c28d2e1-fbd5-4d08-80b6-08d75559aa69 for instance with vm_state building and task_state spawning. [ 1777.553894] env[62277]: DEBUG nova.compute.manager [req-ba6fb88a-1d46-4287-a952-ee6e35c19440 req-9c34a567-e1d0-4560-92b4-62f5645009d5 service nova] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Received event network-changed-0c28d2e1-fbd5-4d08-80b6-08d75559aa69 {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 1777.554052] env[62277]: DEBUG nova.compute.manager [req-ba6fb88a-1d46-4287-a952-ee6e35c19440 req-9c34a567-e1d0-4560-92b4-62f5645009d5 service nova] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Refreshing instance network info cache due to event network-changed-0c28d2e1-fbd5-4d08-80b6-08d75559aa69. {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11104}} [ 1777.554235] env[62277]: DEBUG oslo_concurrency.lockutils [req-ba6fb88a-1d46-4287-a952-ee6e35c19440 req-9c34a567-e1d0-4560-92b4-62f5645009d5 service nova] Acquiring lock "refresh_cache-163eb4e7-33f8-4674-8a3f-5094356e250d" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1777.554369] env[62277]: DEBUG oslo_concurrency.lockutils [req-ba6fb88a-1d46-4287-a952-ee6e35c19440 req-9c34a567-e1d0-4560-92b4-62f5645009d5 service nova] Acquired lock "refresh_cache-163eb4e7-33f8-4674-8a3f-5094356e250d" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1777.554520] env[62277]: DEBUG nova.network.neutron [req-ba6fb88a-1d46-4287-a952-ee6e35c19440 req-9c34a567-e1d0-4560-92b4-62f5645009d5 service nova] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Refreshing network info cache for port 0c28d2e1-fbd5-4d08-80b6-08d75559aa69 {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1777.555829] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-63ba4680-5050-48c9-9659-8e2bd1360f48 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1777.579443] env[62277]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1777.579443] env[62277]: value = "task-1405452" [ 1777.579443] env[62277]: _type = "Task" [ 1777.579443] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1777.587274] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405452, 'name': CreateVM_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1777.893628] env[62277]: DEBUG nova.network.neutron [req-ba6fb88a-1d46-4287-a952-ee6e35c19440 req-9c34a567-e1d0-4560-92b4-62f5645009d5 service nova] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Updated VIF entry in instance network info cache for port 0c28d2e1-fbd5-4d08-80b6-08d75559aa69. 
{{(pid=62277) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1777.893983] env[62277]: DEBUG nova.network.neutron [req-ba6fb88a-1d46-4287-a952-ee6e35c19440 req-9c34a567-e1d0-4560-92b4-62f5645009d5 service nova] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Updating instance_info_cache with network_info: [{"id": "0c28d2e1-fbd5-4d08-80b6-08d75559aa69", "address": "fa:16:3e:90:02:fa", "network": {"id": "83a53f5b-0798-4c93-9294-0cdb526dc3ca", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1943573639-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "24482eabb41e4102a26c9e7576a49c33", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f85835c8-5d0c-4b2f-97c4-6c4006580f79", "external-id": "nsx-vlan-transportzone-245", "segmentation_id": 245, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0c28d2e1-fb", "ovs_interfaceid": "0c28d2e1-fbd5-4d08-80b6-08d75559aa69", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1777.906518] env[62277]: DEBUG oslo_concurrency.lockutils [req-ba6fb88a-1d46-4287-a952-ee6e35c19440 req-9c34a567-e1d0-4560-92b4-62f5645009d5 service nova] Releasing lock "refresh_cache-163eb4e7-33f8-4674-8a3f-5094356e250d" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1778.090059] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405452, 'name': CreateVM_Task, 'duration_secs': 0.316142} completed successfully. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1778.090059] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Created VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1778.090601] env[62277]: DEBUG oslo_concurrency.lockutils [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1778.090771] env[62277]: DEBUG oslo_concurrency.lockutils [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1778.091107] env[62277]: DEBUG oslo_concurrency.lockutils [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1778.091376] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c42a071d-1682-47f0-9273-3a042ff6fb42 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1778.095895] env[62277]: DEBUG oslo_vmware.api [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Waiting for the task: (returnval){ [ 1778.095895] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]5281fefc-50dd-8765-d8fe-28fed258b116" [ 1778.095895] env[62277]: _type = "Task" [ 1778.095895] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1778.103653] env[62277]: DEBUG oslo_vmware.api [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]5281fefc-50dd-8765-d8fe-28fed258b116, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1778.607917] env[62277]: DEBUG oslo_concurrency.lockutils [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1778.608305] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Processing image 6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1778.608435] env[62277]: DEBUG oslo_concurrency.lockutils [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1781.262637] env[62277]: DEBUG oslo_concurrency.lockutils [None req-8b141e2a-510e-45d7-95d9-1549251e1256 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Acquiring lock "163eb4e7-33f8-4674-8a3f-5094356e250d" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1807.511969] env[62277]: DEBUG oslo_concurrency.lockutils [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Acquiring lock "c4c22c8a-4300-45ce-8484-77c638f7bbc5" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1807.512350] env[62277]: DEBUG oslo_concurrency.lockutils [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Lock "c4c22c8a-4300-45ce-8484-77c638f7bbc5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1818.063965] env[62277]: DEBUG oslo_concurrency.lockutils [None req-0493c29c-f7e7-4494-8adf-81be904735b0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Acquiring lock "97a1dad9-a665-42ca-b85e-5fef59ab80bf" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1818.064248] env[62277]: DEBUG oslo_concurrency.lockutils [None req-0493c29c-f7e7-4494-8adf-81be904735b0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Lock "97a1dad9-a665-42ca-b85e-5fef59ab80bf" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s 
{{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1824.169398] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1824.169809] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Cleaning up deleted instances with incomplete migration {{(pid=62277) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11234}} [ 1824.380735] env[62277]: WARNING oslo_vmware.rw_handles [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1824.380735] env[62277]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1824.380735] env[62277]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1824.380735] env[62277]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1824.380735] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1824.380735] env[62277]: ERROR oslo_vmware.rw_handles response.begin() [ 1824.380735] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1824.380735] env[62277]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1824.380735] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1824.380735] env[62277]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1824.380735] env[62277]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1824.380735] env[62277]: ERROR oslo_vmware.rw_handles [ 1824.381231] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Downloaded image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to vmware_temp/0f5b46fd-b48a-45e3-91cb-ca3076ebd654/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1824.383062] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Caching image {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1824.383306] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Copying Virtual Disk [datastore2] vmware_temp/0f5b46fd-b48a-45e3-91cb-ca3076ebd654/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk to [datastore2] vmware_temp/0f5b46fd-b48a-45e3-91cb-ca3076ebd654/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk 
{{(pid=62277) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1824.383592] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-bf4b1c1c-245f-4c3f-9663-d3c5d6827c38 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1824.391603] env[62277]: DEBUG oslo_vmware.api [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Waiting for the task: (returnval){ [ 1824.391603] env[62277]: value = "task-1405453" [ 1824.391603] env[62277]: _type = "Task" [ 1824.391603] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1824.399431] env[62277]: DEBUG oslo_vmware.api [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Task: {'id': task-1405453, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1824.902388] env[62277]: DEBUG oslo_vmware.exceptions [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Fault InvalidArgument not matched. {{(pid=62277) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1824.902676] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1824.903228] env[62277]: ERROR nova.compute.manager [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1824.903228] env[62277]: Faults: ['InvalidArgument'] [ 1824.903228] env[62277]: ERROR nova.compute.manager [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Traceback (most recent call last): [ 1824.903228] env[62277]: ERROR nova.compute.manager [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1824.903228] env[62277]: ERROR nova.compute.manager [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] yield resources [ 1824.903228] env[62277]: ERROR nova.compute.manager [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1824.903228] env[62277]: ERROR nova.compute.manager [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] self.driver.spawn(context, instance, image_meta, [ 1824.903228] env[62277]: ERROR nova.compute.manager [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1824.903228] env[62277]: ERROR nova.compute.manager [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1824.903228] env[62277]: ERROR 
nova.compute.manager [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1824.903228] env[62277]: ERROR nova.compute.manager [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] self._fetch_image_if_missing(context, vi) [ 1824.903228] env[62277]: ERROR nova.compute.manager [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1824.903228] env[62277]: ERROR nova.compute.manager [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] image_cache(vi, tmp_image_ds_loc) [ 1824.903228] env[62277]: ERROR nova.compute.manager [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1824.903228] env[62277]: ERROR nova.compute.manager [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] vm_util.copy_virtual_disk( [ 1824.903228] env[62277]: ERROR nova.compute.manager [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1824.903228] env[62277]: ERROR nova.compute.manager [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] session._wait_for_task(vmdk_copy_task) [ 1824.903228] env[62277]: ERROR nova.compute.manager [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1824.903228] env[62277]: ERROR nova.compute.manager [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] return self.wait_for_task(task_ref) [ 1824.903228] env[62277]: ERROR nova.compute.manager [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1824.903228] env[62277]: ERROR nova.compute.manager [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] return evt.wait() [ 1824.903228] env[62277]: ERROR nova.compute.manager [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1824.903228] env[62277]: ERROR nova.compute.manager [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] result = hub.switch() [ 1824.903228] env[62277]: ERROR nova.compute.manager [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1824.903228] env[62277]: ERROR nova.compute.manager [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] return self.greenlet.switch() [ 1824.903228] env[62277]: ERROR nova.compute.manager [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1824.903228] env[62277]: ERROR nova.compute.manager [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] self.f(*self.args, **self.kw) [ 1824.903228] env[62277]: ERROR nova.compute.manager [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1824.903228] env[62277]: ERROR nova.compute.manager [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] raise exceptions.translate_fault(task_info.error) [ 1824.903228] env[62277]: ERROR nova.compute.manager [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1824.903228] env[62277]: ERROR nova.compute.manager [instance: 
32bed248-06d5-47a1-b281-47921d99dbf6] Faults: ['InvalidArgument'] [ 1824.903228] env[62277]: ERROR nova.compute.manager [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] [ 1824.904036] env[62277]: INFO nova.compute.manager [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Terminating instance [ 1824.905131] env[62277]: DEBUG oslo_concurrency.lockutils [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1824.905349] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1824.905955] env[62277]: DEBUG nova.compute.manager [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Start destroying the instance on the hypervisor. {{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1824.906155] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1824.906444] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d38a782f-9f56-48e0-b2a7-5a70bab2e221 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1824.908801] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-50bdc012-ebcb-4e68-a82c-7409a0222e24 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1824.916595] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Unregistering the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1824.916795] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-fe720b9a-db03-4bab-a3a1-97911f677b85 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1824.918895] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1824.919071] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 
tempest-ServerTagsTestJSON-125218464-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62277) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1824.920072] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-5e7c51a8-cadc-48a6-835e-eae323bcf4a0 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1824.924422] env[62277]: DEBUG oslo_vmware.api [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Waiting for the task: (returnval){ [ 1824.924422] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52438aa9-dc51-d3ab-8b62-0f6a085fd372" [ 1824.924422] env[62277]: _type = "Task" [ 1824.924422] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1824.931437] env[62277]: DEBUG oslo_vmware.api [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52438aa9-dc51-d3ab-8b62-0f6a085fd372, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1824.988195] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Unregistered the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1824.988496] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Deleting contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1824.988689] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Deleting the datastore file [datastore2] 32bed248-06d5-47a1-b281-47921d99dbf6 {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1824.989231] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-45dec563-2729-4b41-a0c2-ef4dbfbe1c3d {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1824.994810] env[62277]: DEBUG oslo_vmware.api [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Waiting for the task: (returnval){ [ 1824.994810] env[62277]: value = "task-1405455" [ 1824.994810] env[62277]: _type = "Task" [ 1824.994810] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1825.002303] env[62277]: DEBUG oslo_vmware.api [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Task: {'id': task-1405455, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1825.179251] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1825.179572] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1825.179659] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Cleaning up deleted instances {{(pid=62277) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11196}} [ 1825.190074] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] There are 0 instances to clean {{(pid=62277) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11205}} [ 1825.434718] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] [instance: 8d00162c-7379-48b6-841b-f802db2582db] Preparing fetch location {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1825.434986] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Creating directory with path [datastore2] vmware_temp/8cf51728-8e0f-475f-8926-eca13b55895f/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1825.435221] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f3fc0243-3a7b-4472-84e3-70e68aaee30c {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1825.446225] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Created directory with path [datastore2] vmware_temp/8cf51728-8e0f-475f-8926-eca13b55895f/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1825.446940] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] [instance: 8d00162c-7379-48b6-841b-f802db2582db] Fetch image to [datastore2] vmware_temp/8cf51728-8e0f-475f-8926-eca13b55895f/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1825.446940] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] [instance: 8d00162c-7379-48b6-841b-f802db2582db] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to [datastore2] vmware_temp/8cf51728-8e0f-475f-8926-eca13b55895f/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) _fetch_image_as_file 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1825.447290] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f1d79983-263c-446e-a3af-6a744fa7d65e {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1825.454303] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-affe0ab1-419c-46ee-8497-119aadebc530 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1825.463146] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-190dcccc-8c52-42fb-9ba0-8f4a438b9d73 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1825.492572] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-732e411b-b06d-48c8-bc8f-7ae68b1716d0 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1825.503640] env[62277]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-ef851ebb-35af-4620-899d-5f8cbdb05f76 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1825.505341] env[62277]: DEBUG oslo_vmware.api [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Task: {'id': task-1405455, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074466} completed successfully. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1825.505539] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Deleted the datastore file {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1825.505743] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Deleted contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1825.505880] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1825.506059] env[62277]: INFO nova.compute.manager [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1825.508578] env[62277]: DEBUG nova.compute.claims [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Aborting claim: {{(pid=62277) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1825.508749] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1825.508963] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1825.526443] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] [instance: 8d00162c-7379-48b6-841b-f802db2582db] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1825.579183] env[62277]: DEBUG oslo_vmware.rw_handles [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/8cf51728-8e0f-475f-8926-eca13b55895f/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62277) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1825.641921] env[62277]: DEBUG oslo_vmware.rw_handles [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Completed reading data from the image iterator. {{(pid=62277) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1825.642178] env[62277]: DEBUG oslo_vmware.rw_handles [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/8cf51728-8e0f-475f-8926-eca13b55895f/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=62277) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1825.817479] env[62277]: DEBUG oslo_concurrency.lockutils [None req-28c38721-5ec2-400f-8af8-69dde787e0ab tempest-ListImageFiltersTestJSON-1819822959 tempest-ListImageFiltersTestJSON-1819822959-project-member] Acquiring lock "7748e3a1-6adc-4482-90f9-a3816a224272" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1825.817479] env[62277]: DEBUG oslo_concurrency.lockutils [None req-28c38721-5ec2-400f-8af8-69dde787e0ab tempest-ListImageFiltersTestJSON-1819822959 tempest-ListImageFiltersTestJSON-1819822959-project-member] Lock "7748e3a1-6adc-4482-90f9-a3816a224272" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1825.863920] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-97122200-557f-42fc-a83e-53f095b88c09 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1825.873091] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e7337f04-cbf6-4be4-89ed-825f323fa93c {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1825.904011] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-453b5aa4-b26c-4092-a02b-e5bcecdf2033 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1825.911395] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7ad10480-139e-45fe-b4eb-4edfd440b969 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1825.924365] env[62277]: DEBUG nova.compute.provider_tree [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1825.939957] env[62277]: DEBUG nova.scheduler.client.report [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1825.959361] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.450s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1825.959900] env[62277]: ERROR nova.compute.manager [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1825.959900] env[62277]: Faults: ['InvalidArgument'] [ 1825.959900] env[62277]: ERROR nova.compute.manager [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Traceback (most recent call last): [ 1825.959900] env[62277]: ERROR nova.compute.manager [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1825.959900] env[62277]: ERROR nova.compute.manager [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] self.driver.spawn(context, instance, image_meta, [ 1825.959900] env[62277]: ERROR nova.compute.manager [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1825.959900] env[62277]: ERROR nova.compute.manager [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1825.959900] env[62277]: ERROR nova.compute.manager [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1825.959900] env[62277]: ERROR nova.compute.manager [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] self._fetch_image_if_missing(context, vi) [ 1825.959900] env[62277]: ERROR nova.compute.manager [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1825.959900] env[62277]: ERROR nova.compute.manager [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] image_cache(vi, tmp_image_ds_loc) [ 1825.959900] env[62277]: ERROR nova.compute.manager [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1825.959900] env[62277]: ERROR nova.compute.manager [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] vm_util.copy_virtual_disk( [ 1825.959900] env[62277]: ERROR nova.compute.manager [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1825.959900] env[62277]: ERROR nova.compute.manager [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] session._wait_for_task(vmdk_copy_task) [ 1825.959900] env[62277]: ERROR nova.compute.manager [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1825.959900] env[62277]: ERROR nova.compute.manager [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] return self.wait_for_task(task_ref) [ 1825.959900] env[62277]: ERROR nova.compute.manager [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1825.959900] env[62277]: ERROR nova.compute.manager [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] return evt.wait() [ 1825.959900] env[62277]: ERROR nova.compute.manager [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1825.959900] env[62277]: ERROR nova.compute.manager [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] result = hub.switch() [ 1825.959900] env[62277]: ERROR nova.compute.manager [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1825.959900] env[62277]: ERROR nova.compute.manager [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] return self.greenlet.switch() [ 1825.959900] env[62277]: ERROR nova.compute.manager [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1825.959900] env[62277]: ERROR nova.compute.manager [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] self.f(*self.args, **self.kw) [ 1825.959900] env[62277]: ERROR nova.compute.manager [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1825.959900] env[62277]: ERROR nova.compute.manager [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] raise exceptions.translate_fault(task_info.error) [ 1825.959900] env[62277]: ERROR nova.compute.manager [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1825.959900] env[62277]: ERROR nova.compute.manager [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Faults: ['InvalidArgument'] [ 1825.959900] env[62277]: ERROR nova.compute.manager [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] [ 1825.960857] env[62277]: DEBUG nova.compute.utils [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] VimFaultException {{(pid=62277) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1825.962814] env[62277]: DEBUG nova.compute.manager [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Build of instance 32bed248-06d5-47a1-b281-47921d99dbf6 was re-scheduled: A specified parameter was not correct: fileType [ 1825.962814] env[62277]: Faults: ['InvalidArgument'] {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1825.963257] env[62277]: DEBUG nova.compute.manager [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Unplugging VIFs for instance {{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1825.963257] env[62277]: DEBUG nova.compute.manager [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1825.963366] env[62277]: DEBUG nova.compute.manager [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1825.963530] env[62277]: DEBUG nova.network.neutron [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1826.030693] env[62277]: DEBUG oslo_concurrency.lockutils [None req-c7179512-b148-43e2-bbc6-bc1d50376dcf tempest-ListImageFiltersTestJSON-1819822959 tempest-ListImageFiltersTestJSON-1819822959-project-member] Acquiring lock "71f172bf-94bd-4027-bbae-f3bd3e1f91c3" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1826.030693] env[62277]: DEBUG oslo_concurrency.lockutils [None req-c7179512-b148-43e2-bbc6-bc1d50376dcf tempest-ListImageFiltersTestJSON-1819822959 tempest-ListImageFiltersTestJSON-1819822959-project-member] Lock "71f172bf-94bd-4027-bbae-f3bd3e1f91c3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1826.174618] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1826.378799] env[62277]: DEBUG nova.network.neutron [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1826.390652] env[62277]: INFO nova.compute.manager [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Took 0.43 seconds to deallocate network for instance. 
[ 1826.502823] env[62277]: INFO nova.scheduler.client.report [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Deleted allocations for instance 32bed248-06d5-47a1-b281-47921d99dbf6 [ 1826.529919] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b3af3186-8a85-43f4-be41-a38e9dafbd9d tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Lock "32bed248-06d5-47a1-b281-47921d99dbf6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 604.222s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1826.531149] env[62277]: DEBUG oslo_concurrency.lockutils [None req-88dfee0e-d576-41c1-b681-1766f46949a4 tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Lock "32bed248-06d5-47a1-b281-47921d99dbf6" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 406.757s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1826.531364] env[62277]: DEBUG oslo_concurrency.lockutils [None req-88dfee0e-d576-41c1-b681-1766f46949a4 tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Acquiring lock "32bed248-06d5-47a1-b281-47921d99dbf6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1826.531569] env[62277]: DEBUG oslo_concurrency.lockutils [None req-88dfee0e-d576-41c1-b681-1766f46949a4 tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Lock "32bed248-06d5-47a1-b281-47921d99dbf6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1826.531739] env[62277]: DEBUG oslo_concurrency.lockutils [None req-88dfee0e-d576-41c1-b681-1766f46949a4 tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Lock "32bed248-06d5-47a1-b281-47921d99dbf6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1826.534269] env[62277]: INFO nova.compute.manager [None req-88dfee0e-d576-41c1-b681-1766f46949a4 tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Terminating instance [ 1826.536152] env[62277]: DEBUG nova.compute.manager [None req-88dfee0e-d576-41c1-b681-1766f46949a4 tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Start destroying the instance on the hypervisor. 
{{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1826.536421] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-88dfee0e-d576-41c1-b681-1766f46949a4 tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1826.537111] env[62277]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-08a8927b-7517-4759-89f7-fb2549f71840 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1826.542407] env[62277]: DEBUG nova.compute.manager [None req-625a5d10-4ad2-4624-8618-29ea970def61 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: 2fdebb33-a32b-4753-aa2e-adfc4b252fac] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1826.549824] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ab2a16d-b457-4ddd-b2e2-5ab54937309e {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1826.578729] env[62277]: WARNING nova.virt.vmwareapi.vmops [None req-88dfee0e-d576-41c1-b681-1766f46949a4 tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 32bed248-06d5-47a1-b281-47921d99dbf6 could not be found. [ 1826.578946] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-88dfee0e-d576-41c1-b681-1766f46949a4 tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1826.579135] env[62277]: INFO nova.compute.manager [None req-88dfee0e-d576-41c1-b681-1766f46949a4 tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1826.579382] env[62277]: DEBUG oslo.service.loopingcall [None req-88dfee0e-d576-41c1-b681-1766f46949a4 tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1826.579797] env[62277]: DEBUG nova.compute.manager [None req-625a5d10-4ad2-4624-8618-29ea970def61 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: 2fdebb33-a32b-4753-aa2e-adfc4b252fac] Instance disappeared before build. 
{{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1826.580674] env[62277]: DEBUG nova.compute.manager [-] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1826.580789] env[62277]: DEBUG nova.network.neutron [-] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1826.604291] env[62277]: DEBUG oslo_concurrency.lockutils [None req-625a5d10-4ad2-4624-8618-29ea970def61 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Lock "2fdebb33-a32b-4753-aa2e-adfc4b252fac" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 216.329s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1826.605646] env[62277]: DEBUG nova.network.neutron [-] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1826.612221] env[62277]: DEBUG nova.compute.manager [None req-78215bee-628c-43dc-8ea4-260fa53b23f2 tempest-VolumesAdminNegativeTest-1493025573 tempest-VolumesAdminNegativeTest-1493025573-project-member] [instance: 5ed6905b-ddb5-4517-a8dc-ee8e00b53db0] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1826.614982] env[62277]: INFO nova.compute.manager [-] [instance: 32bed248-06d5-47a1-b281-47921d99dbf6] Took 0.03 seconds to deallocate network for instance. [ 1826.632972] env[62277]: DEBUG nova.compute.manager [None req-78215bee-628c-43dc-8ea4-260fa53b23f2 tempest-VolumesAdminNegativeTest-1493025573 tempest-VolumesAdminNegativeTest-1493025573-project-member] [instance: 5ed6905b-ddb5-4517-a8dc-ee8e00b53db0] Instance disappeared before build. {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1826.651517] env[62277]: DEBUG oslo_concurrency.lockutils [None req-78215bee-628c-43dc-8ea4-260fa53b23f2 tempest-VolumesAdminNegativeTest-1493025573 tempest-VolumesAdminNegativeTest-1493025573-project-member] Lock "5ed6905b-ddb5-4517-a8dc-ee8e00b53db0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 209.916s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1826.660432] env[62277]: DEBUG nova.compute.manager [None req-c98419e1-ce19-48ef-b537-74c6ede412ba tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] [instance: 9f8669ed-f65f-4472-9ef6-01953c48466b] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1826.682580] env[62277]: DEBUG nova.compute.manager [None req-c98419e1-ce19-48ef-b537-74c6ede412ba tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] [instance: 9f8669ed-f65f-4472-9ef6-01953c48466b] Instance disappeared before build. 
{{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1826.698956] env[62277]: DEBUG oslo_concurrency.lockutils [None req-88dfee0e-d576-41c1-b681-1766f46949a4 tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Lock "32bed248-06d5-47a1-b281-47921d99dbf6" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.168s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1826.702854] env[62277]: DEBUG oslo_concurrency.lockutils [None req-c98419e1-ce19-48ef-b537-74c6ede412ba tempest-AttachVolumeTestJSON-899048172 tempest-AttachVolumeTestJSON-899048172-project-member] Lock "9f8669ed-f65f-4472-9ef6-01953c48466b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 207.939s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1826.711174] env[62277]: DEBUG nova.compute.manager [None req-eab2f50d-f473-463c-9429-9f09b3de3993 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] [instance: 3b8f2c10-5dce-44d0-bb6b-939afc01e44b] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1826.735085] env[62277]: DEBUG nova.compute.manager [None req-eab2f50d-f473-463c-9429-9f09b3de3993 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] [instance: 3b8f2c10-5dce-44d0-bb6b-939afc01e44b] Instance disappeared before build. {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1826.754018] env[62277]: DEBUG oslo_concurrency.lockutils [None req-eab2f50d-f473-463c-9429-9f09b3de3993 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Lock "3b8f2c10-5dce-44d0-bb6b-939afc01e44b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 205.683s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1826.763238] env[62277]: DEBUG nova.compute.manager [None req-1f6a5bbb-98c9-46d5-8d0c-79b5fb6e8c4a tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 5a5ff5bf-d965-42e2-aa8b-67be4b5f7362] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1826.788720] env[62277]: DEBUG nova.compute.manager [None req-1f6a5bbb-98c9-46d5-8d0c-79b5fb6e8c4a tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 5a5ff5bf-d965-42e2-aa8b-67be4b5f7362] Instance disappeared before build. 
{{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1826.813402] env[62277]: DEBUG oslo_concurrency.lockutils [None req-1f6a5bbb-98c9-46d5-8d0c-79b5fb6e8c4a tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Lock "5a5ff5bf-d965-42e2-aa8b-67be4b5f7362" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 175.298s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1826.822743] env[62277]: DEBUG nova.compute.manager [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1826.874428] env[62277]: DEBUG oslo_concurrency.lockutils [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1826.874673] env[62277]: DEBUG oslo_concurrency.lockutils [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1826.876135] env[62277]: INFO nova.compute.claims [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1826.938583] env[62277]: DEBUG nova.scheduler.client.report [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Refreshing inventories for resource provider 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 1826.953752] env[62277]: DEBUG nova.scheduler.client.report [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Updating ProviderTree inventory for provider 75e125ea-a599-4b65-b9cd-6ea881735292 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 1826.953982] env[62277]: DEBUG nova.compute.provider_tree [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Updating inventory in ProviderTree for provider 75e125ea-a599-4b65-b9cd-6ea881735292 with inventory: {'VCPU': {'total': 48, 
'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 1826.965388] env[62277]: DEBUG nova.scheduler.client.report [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Refreshing aggregate associations for resource provider 75e125ea-a599-4b65-b9cd-6ea881735292, aggregates: None {{(pid=62277) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 1826.984851] env[62277]: DEBUG nova.scheduler.client.report [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Refreshing trait associations for resource provider 75e125ea-a599-4b65-b9cd-6ea881735292, traits: COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_VMDK {{(pid=62277) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 1827.168472] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1827.168648] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Starting heal instance info cache {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9909}} [ 1827.168771] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Rebuilding the list of instances to heal {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9913}} [ 1827.184677] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-49e84195-5962-4975-8353-96b10bcbe508 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1827.189832] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 8d00162c-7379-48b6-841b-f802db2582db] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1827.189983] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1827.190125] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1827.190248] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 13959890-87a1-45ba-98de-621373e265e7] Skipping network cache update for instance because it is Building. 
{{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1827.190369] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1827.190487] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1827.190605] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1827.190721] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1827.190871] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1827.190997] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1827.191129] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Didn't find any instances for network info cache update. 
{{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9995}} [ 1827.192206] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1827.192387] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1827.196469] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bada3ae5-26b9-42a8-ab9f-6c3c44578a90 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1827.225126] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-504b1b23-903d-4819-9ce2-1e0e41da04d5 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1827.232329] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c6dcbea7-d433-4b7f-8f23-c5f935fa0ea7 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1827.246274] env[62277]: DEBUG nova.compute.provider_tree [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1827.255856] env[62277]: DEBUG nova.scheduler.client.report [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1827.272231] env[62277]: DEBUG oslo_concurrency.lockutils [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.397s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1827.272697] env[62277]: DEBUG nova.compute.manager [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Start building networks asynchronously for instance. 
{{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1827.304170] env[62277]: DEBUG nova.compute.utils [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Using /dev/sd instead of None {{(pid=62277) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1827.305729] env[62277]: DEBUG nova.compute.manager [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Allocating IP information in the background. {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1827.305902] env[62277]: DEBUG nova.network.neutron [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] allocate_for_instance() {{(pid=62277) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1827.313738] env[62277]: DEBUG nova.compute.manager [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Start building block device mappings for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1827.362606] env[62277]: DEBUG nova.policy [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '00ed93b61873452bbc15280d2de65bd8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0c951cee39d94e49af963590cccf95fb', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62277) authorize /opt/stack/nova/nova/policy.py:203}} [ 1827.399061] env[62277]: DEBUG nova.compute.manager [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Start spawning the instance on the hypervisor. 
{{(pid=62277) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1827.425773] env[62277]: DEBUG nova.virt.hardware [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-11-06T22:11:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-11-06T22:11:15Z,direct_url=,disk_format='vmdk',id=6f125163-af69-40e9-92ae-3b8a01d74b60,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='63ed6630e9c140baa826f53d7a0564d1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-11-06T22:11:16Z,virtual_size=,visibility=), allow threads: False {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1827.426040] env[62277]: DEBUG nova.virt.hardware [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Flavor limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1827.426204] env[62277]: DEBUG nova.virt.hardware [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Image limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1827.426391] env[62277]: DEBUG nova.virt.hardware [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Flavor pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1827.426554] env[62277]: DEBUG nova.virt.hardware [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Image pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1827.426706] env[62277]: DEBUG nova.virt.hardware [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1827.426906] env[62277]: DEBUG nova.virt.hardware [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1827.427075] env[62277]: DEBUG nova.virt.hardware [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1827.427245] env[62277]: DEBUG nova.virt.hardware [None 
req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Got 1 possible topologies {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1827.427405] env[62277]: DEBUG nova.virt.hardware [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1827.427573] env[62277]: DEBUG nova.virt.hardware [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1827.428494] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bb2d794e-d3c3-4d4b-a49d-0732d46767c2 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1827.436449] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-08225b87-ad7c-4788-b783-911c6eb96983 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1827.688312] env[62277]: DEBUG nova.network.neutron [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Successfully created port: 79d9f3da-3410-429a-b6cb-b29a80f7146b {{(pid=62277) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1828.416146] env[62277]: DEBUG nova.compute.manager [req-c07f1da1-a86f-4f64-bf8a-6136fb771c34 req-3f866655-2e71-4847-8abd-e32b59e48c11 service nova] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Received event network-vif-plugged-79d9f3da-3410-429a-b6cb-b29a80f7146b {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 1828.416146] env[62277]: DEBUG oslo_concurrency.lockutils [req-c07f1da1-a86f-4f64-bf8a-6136fb771c34 req-3f866655-2e71-4847-8abd-e32b59e48c11 service nova] Acquiring lock "400beb27-a709-4ef4-851e-5caaab9ca60b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1828.416146] env[62277]: DEBUG oslo_concurrency.lockutils [req-c07f1da1-a86f-4f64-bf8a-6136fb771c34 req-3f866655-2e71-4847-8abd-e32b59e48c11 service nova] Lock "400beb27-a709-4ef4-851e-5caaab9ca60b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1828.416146] env[62277]: DEBUG oslo_concurrency.lockutils [req-c07f1da1-a86f-4f64-bf8a-6136fb771c34 req-3f866655-2e71-4847-8abd-e32b59e48c11 service nova] Lock "400beb27-a709-4ef4-851e-5caaab9ca60b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1828.416146] env[62277]: DEBUG nova.compute.manager
[req-c07f1da1-a86f-4f64-bf8a-6136fb771c34 req-3f866655-2e71-4847-8abd-e32b59e48c11 service nova] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] No waiting events found dispatching network-vif-plugged-79d9f3da-3410-429a-b6cb-b29a80f7146b {{(pid=62277) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1828.416146] env[62277]: WARNING nova.compute.manager [req-c07f1da1-a86f-4f64-bf8a-6136fb771c34 req-3f866655-2e71-4847-8abd-e32b59e48c11 service nova] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Received unexpected event network-vif-plugged-79d9f3da-3410-429a-b6cb-b29a80f7146b for instance with vm_state building and task_state spawning. [ 1828.520383] env[62277]: DEBUG nova.network.neutron [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Successfully updated port: 79d9f3da-3410-429a-b6cb-b29a80f7146b {{(pid=62277) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1828.536726] env[62277]: DEBUG oslo_concurrency.lockutils [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Acquiring lock "refresh_cache-400beb27-a709-4ef4-851e-5caaab9ca60b" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1828.536825] env[62277]: DEBUG oslo_concurrency.lockutils [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Acquired lock "refresh_cache-400beb27-a709-4ef4-851e-5caaab9ca60b" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1828.536965] env[62277]: DEBUG nova.network.neutron [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Building network info cache for instance {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1828.589656] env[62277]: DEBUG nova.network.neutron [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Instance cache missing network info. 
{{(pid=62277) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1828.758383] env[62277]: DEBUG nova.network.neutron [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Updating instance_info_cache with network_info: [{"id": "79d9f3da-3410-429a-b6cb-b29a80f7146b", "address": "fa:16:3e:20:16:69", "network": {"id": "73ff19f8-2a7e-46b5-ad84-44eb5672dd85", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1025699415-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0c951cee39d94e49af963590cccf95fb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "09bf081b-cdf0-4977-abe2-2339a87409ab", "external-id": "nsx-vlan-transportzone-378", "segmentation_id": 378, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap79d9f3da-34", "ovs_interfaceid": "79d9f3da-3410-429a-b6cb-b29a80f7146b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1828.769132] env[62277]: DEBUG oslo_concurrency.lockutils [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Releasing lock "refresh_cache-400beb27-a709-4ef4-851e-5caaab9ca60b" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1828.769389] env[62277]: DEBUG nova.compute.manager [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Instance network_info: |[{"id": "79d9f3da-3410-429a-b6cb-b29a80f7146b", "address": "fa:16:3e:20:16:69", "network": {"id": "73ff19f8-2a7e-46b5-ad84-44eb5672dd85", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1025699415-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0c951cee39d94e49af963590cccf95fb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "09bf081b-cdf0-4977-abe2-2339a87409ab", "external-id": "nsx-vlan-transportzone-378", "segmentation_id": 378, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap79d9f3da-34", "ovs_interfaceid": "79d9f3da-3410-429a-b6cb-b29a80f7146b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62277) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 1828.769798] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:20:16:69', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '09bf081b-cdf0-4977-abe2-2339a87409ab', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '79d9f3da-3410-429a-b6cb-b29a80f7146b', 'vif_model': 'vmxnet3'}] {{(pid=62277) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1828.777325] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Creating folder: Project (0c951cee39d94e49af963590cccf95fb). Parent ref: group-v297781. {{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1828.777839] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-525de67a-3464-462c-8cf6-4bf237572c77 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1828.788691] env[62277]: INFO nova.virt.vmwareapi.vm_util [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Created folder: Project (0c951cee39d94e49af963590cccf95fb) in parent group-v297781. [ 1828.788831] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Creating folder: Instances. Parent ref: group-v297871. {{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1828.789045] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-887f0f8a-a47a-429f-ba58-d0317576ba3d {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1828.797504] env[62277]: INFO nova.virt.vmwareapi.vm_util [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Created folder: Instances in parent group-v297871. [ 1828.797755] env[62277]: DEBUG oslo.service.loopingcall [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1828.797933] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Creating VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1828.798126] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-b60c0e2a-9d15-4bca-902d-ee6f5081701a {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1828.815380] env[62277]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1828.815380] env[62277]: value = "task-1405458" [ 1828.815380] env[62277]: _type = "Task" [ 1828.815380] env[62277]: } to complete. 
{{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1828.822207] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405458, 'name': CreateVM_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1829.168203] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1829.191818] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1829.324944] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405458, 'name': CreateVM_Task, 'duration_secs': 0.321448} completed successfully. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1829.325091] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Created VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1829.325738] env[62277]: DEBUG oslo_concurrency.lockutils [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1829.325902] env[62277]: DEBUG oslo_concurrency.lockutils [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1829.326248] env[62277]: DEBUG oslo_concurrency.lockutils [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1829.326515] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-57e9303a-90cc-41f6-a4a1-0a390d48677b {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1829.330584] env[62277]: DEBUG oslo_vmware.api [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Waiting for the task: (returnval){ [ 1829.330584] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]529686f0-b614-c565-dd51-b9fabce405fa" [ 1829.330584] env[62277]: _type = "Task" [ 1829.330584] env[62277]: } to complete. 
{{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1829.340770] env[62277]: DEBUG oslo_vmware.api [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]529686f0-b614-c565-dd51-b9fabce405fa, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1829.840160] env[62277]: DEBUG oslo_concurrency.lockutils [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1829.840476] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Processing image 6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1829.840552] env[62277]: DEBUG oslo_concurrency.lockutils [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1830.168641] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager.update_available_resource {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1830.180642] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1830.180872] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1830.181048] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1830.181210] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62277) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1830.182327] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-ded9d908-291e-43a4-a5eb-45c738c87f46 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1830.191203] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f1cfd1a2-b957-47a6-9d7a-201887cb2278 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1830.205181] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-50b3a22d-e833-4bd0-a59c-3e6768c53c11 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1830.211770] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dae017f6-bb32-4a0d-a08b-089562d34f4f {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1830.241799] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181398MB free_disk=184GB free_vcpus=48 pci_devices=None {{(pid=62277) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1830.241799] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1830.242012] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1830.319556] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 8d00162c-7379-48b6-841b-f802db2582db actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1830.319744] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 900160c8-a715-45a4-8709-b314fc3216d5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1830.319873] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 63267d5c-d004-41c1-866a-75b9e37521b7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1830.319996] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 13959890-87a1-45ba-98de-621373e265e7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1830.320130] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance a7cc7e45-8567-4699-af83-624b1c7c5c64 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1830.320247] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 42005809-1926-44b2-8ef6-3b6cb28a4020 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1830.320363] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1830.320477] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1830.320595] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 163eb4e7-33f8-4674-8a3f-5094356e250d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1830.320707] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 400beb27-a709-4ef4-851e-5caaab9ca60b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1830.332293] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 272391f1-a349-4525-91ec-75b3ba7aeb1c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1830.343823] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 227394fe-d0c6-48c8-aed2-433ce34e34f8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1830.354614] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 9f7d4431-d5ea-4f9b-888b-77a6a7772047 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1830.363997] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 8b9ef530-e79f-4cd4-8a88-83871ed65f90 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1830.373468] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 9bf5cf30-5142-4d51-b5d9-1bbfb9eedce8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1830.382256] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance c4c22c8a-4300-45ce-8484-77c638f7bbc5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1830.392980] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 97a1dad9-a665-42ca-b85e-5fef59ab80bf has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1830.402222] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 7748e3a1-6adc-4482-90f9-a3816a224272 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1830.411867] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 71f172bf-94bd-4027-bbae-f3bd3e1f91c3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1830.412113] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1830.412262] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1830.446421] env[62277]: DEBUG nova.compute.manager [req-899c1889-2167-4d92-a8a1-c589762ed11c req-320297d2-f62e-4ce7-bf7b-4daacf80eada service nova] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Received event network-changed-79d9f3da-3410-429a-b6cb-b29a80f7146b {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 1830.446658] env[62277]: DEBUG nova.compute.manager [req-899c1889-2167-4d92-a8a1-c589762ed11c req-320297d2-f62e-4ce7-bf7b-4daacf80eada service nova] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Refreshing instance network info cache due to event network-changed-79d9f3da-3410-429a-b6cb-b29a80f7146b. 
{{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11104}} [ 1830.446867] env[62277]: DEBUG oslo_concurrency.lockutils [req-899c1889-2167-4d92-a8a1-c589762ed11c req-320297d2-f62e-4ce7-bf7b-4daacf80eada service nova] Acquiring lock "refresh_cache-400beb27-a709-4ef4-851e-5caaab9ca60b" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1830.447009] env[62277]: DEBUG oslo_concurrency.lockutils [req-899c1889-2167-4d92-a8a1-c589762ed11c req-320297d2-f62e-4ce7-bf7b-4daacf80eada service nova] Acquired lock "refresh_cache-400beb27-a709-4ef4-851e-5caaab9ca60b" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1830.447172] env[62277]: DEBUG nova.network.neutron [req-899c1889-2167-4d92-a8a1-c589762ed11c req-320297d2-f62e-4ce7-bf7b-4daacf80eada service nova] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Refreshing network info cache for port 79d9f3da-3410-429a-b6cb-b29a80f7146b {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1830.637854] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a34af0e9-cfef-4a29-bf77-24f29d48c94a {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1830.646143] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8424e2e3-6d77-43c1-bbd1-7d87ab8cf57f {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1830.677534] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5890aa4b-bc21-46f0-b6d5-8176cfc9c767 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1830.685067] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b0557cc6-9980-45bf-9ec5-6eaf28c85797 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1830.698353] env[62277]: DEBUG nova.compute.provider_tree [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1830.706969] env[62277]: DEBUG nova.scheduler.client.report [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1830.723703] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62277) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1830.723902] env[62277]: DEBUG oslo_concurrency.lockutils [None 
req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.482s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1830.744451] env[62277]: DEBUG nova.network.neutron [req-899c1889-2167-4d92-a8a1-c589762ed11c req-320297d2-f62e-4ce7-bf7b-4daacf80eada service nova] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Updated VIF entry in instance network info cache for port 79d9f3da-3410-429a-b6cb-b29a80f7146b. {{(pid=62277) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1830.744786] env[62277]: DEBUG nova.network.neutron [req-899c1889-2167-4d92-a8a1-c589762ed11c req-320297d2-f62e-4ce7-bf7b-4daacf80eada service nova] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Updating instance_info_cache with network_info: [{"id": "79d9f3da-3410-429a-b6cb-b29a80f7146b", "address": "fa:16:3e:20:16:69", "network": {"id": "73ff19f8-2a7e-46b5-ad84-44eb5672dd85", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1025699415-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0c951cee39d94e49af963590cccf95fb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "09bf081b-cdf0-4977-abe2-2339a87409ab", "external-id": "nsx-vlan-transportzone-378", "segmentation_id": 378, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap79d9f3da-34", "ovs_interfaceid": "79d9f3da-3410-429a-b6cb-b29a80f7146b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1830.754436] env[62277]: DEBUG oslo_concurrency.lockutils [req-899c1889-2167-4d92-a8a1-c589762ed11c req-320297d2-f62e-4ce7-bf7b-4daacf80eada service nova] Releasing lock "refresh_cache-400beb27-a709-4ef4-851e-5caaab9ca60b" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1831.724413] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1832.169140] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1832.169320] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=62277) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10528}} [ 1832.169532] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1841.024801] env[62277]: DEBUG oslo_concurrency.lockutils [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Acquiring lock "ad83bf06-d712-4bd4-8086-9c3b615adaf5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1841.025219] env[62277]: DEBUG oslo_concurrency.lockutils [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Lock "ad83bf06-d712-4bd4-8086-9c3b615adaf5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1848.296031] env[62277]: DEBUG oslo_concurrency.lockutils [None req-88bc2231-abac-4b6e-82e1-07919e9ebcd4 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Acquiring lock "400beb27-a709-4ef4-851e-5caaab9ca60b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1855.688773] env[62277]: DEBUG oslo_concurrency.lockutils [None req-79b9b1c6-a222-4945-bee1-3943b02a2efc tempest-AttachVolumeNegativeTest-184472694 tempest-AttachVolumeNegativeTest-184472694-project-member] Acquiring lock "e0c4112c-6bc6-44e4-8e43-fda8203bf1c3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1855.689097] env[62277]: DEBUG oslo_concurrency.lockutils [None req-79b9b1c6-a222-4945-bee1-3943b02a2efc tempest-AttachVolumeNegativeTest-184472694 tempest-AttachVolumeNegativeTest-184472694-project-member] Lock "e0c4112c-6bc6-44e4-8e43-fda8203bf1c3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1862.088928] env[62277]: DEBUG oslo_concurrency.lockutils [None req-0afa8b9c-dcfc-44a8-a6b0-bd433c5ab7c1 tempest-ServersNegativeTestJSON-1317204946 tempest-ServersNegativeTestJSON-1317204946-project-member] Acquiring lock "81c79e22-aaa3-45dc-967a-b4a884f692eb" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1862.089236] env[62277]: DEBUG oslo_concurrency.lockutils [None req-0afa8b9c-dcfc-44a8-a6b0-bd433c5ab7c1 tempest-ServersNegativeTestJSON-1317204946 tempest-ServersNegativeTestJSON-1317204946-project-member] Lock "81c79e22-aaa3-45dc-967a-b4a884f692eb" acquired by
"nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1870.081701] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._sync_power_states {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1870.103828] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Getting list of instances from cluster (obj){ [ 1870.103828] env[62277]: value = "domain-c8" [ 1870.103828] env[62277]: _type = "ClusterComputeResource" [ 1870.103828] env[62277]: } {{(pid=62277) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 1870.105183] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-22ff3fab-2d5a-4864-8b14-e2d8ff167dc3 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1870.123615] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Got total of 10 instances {{(pid=62277) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 1870.123786] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Triggering sync for uuid 8d00162c-7379-48b6-841b-f802db2582db {{(pid=62277) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10319}} [ 1870.123994] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Triggering sync for uuid 900160c8-a715-45a4-8709-b314fc3216d5 {{(pid=62277) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10319}} [ 1870.124176] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Triggering sync for uuid 63267d5c-d004-41c1-866a-75b9e37521b7 {{(pid=62277) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10319}} [ 1870.124332] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Triggering sync for uuid 13959890-87a1-45ba-98de-621373e265e7 {{(pid=62277) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10319}} [ 1870.124480] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Triggering sync for uuid a7cc7e45-8567-4699-af83-624b1c7c5c64 {{(pid=62277) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10319}} [ 1870.124628] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Triggering sync for uuid 42005809-1926-44b2-8ef6-3b6cb28a4020 {{(pid=62277) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10319}} [ 1870.124778] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Triggering sync for uuid b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d {{(pid=62277) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10319}} [ 1870.124923] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Triggering sync for uuid 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3 {{(pid=62277) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10319}} [ 1870.125115] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Triggering
sync for uuid 163eb4e7-33f8-4674-8a3f-5094356e250d {{(pid=62277) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10319}} [ 1870.125276] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Triggering sync for uuid 400beb27-a709-4ef4-851e-5caaab9ca60b {{(pid=62277) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10319}} [ 1870.125600] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "8d00162c-7379-48b6-841b-f802db2582db" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1870.125837] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "900160c8-a715-45a4-8709-b314fc3216d5" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1870.126049] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "63267d5c-d004-41c1-866a-75b9e37521b7" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1870.126251] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "13959890-87a1-45ba-98de-621373e265e7" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1870.126446] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "a7cc7e45-8567-4699-af83-624b1c7c5c64" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1870.126643] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "42005809-1926-44b2-8ef6-3b6cb28a4020" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1870.126865] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1870.127083] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1870.127281] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring
lock "163eb4e7-33f8-4674-8a3f-5094356e250d" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1870.127473] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "400beb27-a709-4ef4-851e-5caaab9ca60b" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1874.399482] env[62277]: WARNING oslo_vmware.rw_handles [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1874.399482] env[62277]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1874.399482] env[62277]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1874.399482] env[62277]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1874.399482] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1874.399482] env[62277]: ERROR oslo_vmware.rw_handles response.begin() [ 1874.399482] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1874.399482] env[62277]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1874.399482] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1874.399482] env[62277]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1874.399482] env[62277]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1874.399482] env[62277]: ERROR oslo_vmware.rw_handles [ 1874.400284] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] [instance: 8d00162c-7379-48b6-841b-f802db2582db] Downloaded image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to vmware_temp/8cf51728-8e0f-475f-8926-eca13b55895f/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1874.402089] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] [instance: 8d00162c-7379-48b6-841b-f802db2582db] Caching image {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1874.402358] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Copying Virtual Disk [datastore2] vmware_temp/8cf51728-8e0f-475f-8926-eca13b55895f/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk to [datastore2] vmware_temp/8cf51728-8e0f-475f-8926-eca13b55895f/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk {{(pid=62277) copy_virtual_disk
/opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1874.402644] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-e54257eb-d51f-4700-a140-7900c587b9a1 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1874.411498] env[62277]: DEBUG oslo_vmware.api [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Waiting for the task: (returnval){ [ 1874.411498] env[62277]: value = "task-1405459" [ 1874.411498] env[62277]: _type = "Task" [ 1874.411498] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1874.419188] env[62277]: DEBUG oslo_vmware.api [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Task: {'id': task-1405459, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1874.922643] env[62277]: DEBUG oslo_vmware.exceptions [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Fault InvalidArgument not matched. {{(pid=62277) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1874.922938] env[62277]: DEBUG oslo_concurrency.lockutils [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1874.923511] env[62277]: ERROR nova.compute.manager [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] [instance: 8d00162c-7379-48b6-841b-f802db2582db] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1874.923511] env[62277]: Faults: ['InvalidArgument'] [ 1874.923511] env[62277]: ERROR nova.compute.manager [instance: 8d00162c-7379-48b6-841b-f802db2582db] Traceback (most recent call last): [ 1874.923511] env[62277]: ERROR nova.compute.manager [instance: 8d00162c-7379-48b6-841b-f802db2582db] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1874.923511] env[62277]: ERROR nova.compute.manager [instance: 8d00162c-7379-48b6-841b-f802db2582db] yield resources [ 1874.923511] env[62277]: ERROR nova.compute.manager [instance: 8d00162c-7379-48b6-841b-f802db2582db] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1874.923511] env[62277]: ERROR nova.compute.manager [instance: 8d00162c-7379-48b6-841b-f802db2582db] self.driver.spawn(context, instance, image_meta, [ 1874.923511] env[62277]: ERROR nova.compute.manager [instance: 8d00162c-7379-48b6-841b-f802db2582db] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1874.923511] env[62277]: ERROR nova.compute.manager [instance: 8d00162c-7379-48b6-841b-f802db2582db] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1874.923511] env[62277]: ERROR nova.compute.manager [instance: 
8d00162c-7379-48b6-841b-f802db2582db] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1874.923511] env[62277]: ERROR nova.compute.manager [instance: 8d00162c-7379-48b6-841b-f802db2582db] self._fetch_image_if_missing(context, vi) [ 1874.923511] env[62277]: ERROR nova.compute.manager [instance: 8d00162c-7379-48b6-841b-f802db2582db] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1874.923511] env[62277]: ERROR nova.compute.manager [instance: 8d00162c-7379-48b6-841b-f802db2582db] image_cache(vi, tmp_image_ds_loc) [ 1874.923511] env[62277]: ERROR nova.compute.manager [instance: 8d00162c-7379-48b6-841b-f802db2582db] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1874.923511] env[62277]: ERROR nova.compute.manager [instance: 8d00162c-7379-48b6-841b-f802db2582db] vm_util.copy_virtual_disk( [ 1874.923511] env[62277]: ERROR nova.compute.manager [instance: 8d00162c-7379-48b6-841b-f802db2582db] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1874.923511] env[62277]: ERROR nova.compute.manager [instance: 8d00162c-7379-48b6-841b-f802db2582db] session._wait_for_task(vmdk_copy_task) [ 1874.923511] env[62277]: ERROR nova.compute.manager [instance: 8d00162c-7379-48b6-841b-f802db2582db] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1874.923511] env[62277]: ERROR nova.compute.manager [instance: 8d00162c-7379-48b6-841b-f802db2582db] return self.wait_for_task(task_ref) [ 1874.923511] env[62277]: ERROR nova.compute.manager [instance: 8d00162c-7379-48b6-841b-f802db2582db] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1874.923511] env[62277]: ERROR nova.compute.manager [instance: 8d00162c-7379-48b6-841b-f802db2582db] return evt.wait() [ 1874.923511] env[62277]: ERROR nova.compute.manager [instance: 8d00162c-7379-48b6-841b-f802db2582db] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1874.923511] env[62277]: ERROR nova.compute.manager [instance: 8d00162c-7379-48b6-841b-f802db2582db] result = hub.switch() [ 1874.923511] env[62277]: ERROR nova.compute.manager [instance: 8d00162c-7379-48b6-841b-f802db2582db] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1874.923511] env[62277]: ERROR nova.compute.manager [instance: 8d00162c-7379-48b6-841b-f802db2582db] return self.greenlet.switch() [ 1874.923511] env[62277]: ERROR nova.compute.manager [instance: 8d00162c-7379-48b6-841b-f802db2582db] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1874.923511] env[62277]: ERROR nova.compute.manager [instance: 8d00162c-7379-48b6-841b-f802db2582db] self.f(*self.args, **self.kw) [ 1874.923511] env[62277]: ERROR nova.compute.manager [instance: 8d00162c-7379-48b6-841b-f802db2582db] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1874.923511] env[62277]: ERROR nova.compute.manager [instance: 8d00162c-7379-48b6-841b-f802db2582db] raise exceptions.translate_fault(task_info.error) [ 1874.923511] env[62277]: ERROR nova.compute.manager [instance: 8d00162c-7379-48b6-841b-f802db2582db] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1874.923511] env[62277]: ERROR nova.compute.manager [instance: 8d00162c-7379-48b6-841b-f802db2582db] Faults: 
['InvalidArgument'] [ 1874.923511] env[62277]: ERROR nova.compute.manager [instance: 8d00162c-7379-48b6-841b-f802db2582db] [ 1874.924455] env[62277]: INFO nova.compute.manager [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] [instance: 8d00162c-7379-48b6-841b-f802db2582db] Terminating instance [ 1874.925382] env[62277]: DEBUG oslo_concurrency.lockutils [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1874.925587] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1874.925815] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-56d3833c-ecc9-48b1-a1da-117d929b6a04 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1874.928121] env[62277]: DEBUG nova.compute.manager [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] [instance: 8d00162c-7379-48b6-841b-f802db2582db] Start destroying the instance on the hypervisor. {{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1874.928263] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] [instance: 8d00162c-7379-48b6-841b-f802db2582db] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1874.929225] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-63d93713-6b70-44c7-9669-8e425383dffd {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1874.936148] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] [instance: 8d00162c-7379-48b6-841b-f802db2582db] Unregistering the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1874.936363] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-09541ada-6e24-4392-bab1-37b548e4336c {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1874.938459] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1874.938640] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 
tempest-InstanceActionsTestJSON-523053051-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62277) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1874.939575] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-60519b84-e4e1-4481-afb3-9861e272639b {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1874.944319] env[62277]: DEBUG oslo_vmware.api [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Waiting for the task: (returnval){ [ 1874.944319] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52ac689c-5c40-eaa9-4e45-68c813a22695" [ 1874.944319] env[62277]: _type = "Task" [ 1874.944319] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1874.951602] env[62277]: DEBUG oslo_vmware.api [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52ac689c-5c40-eaa9-4e45-68c813a22695, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1875.001707] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] [instance: 8d00162c-7379-48b6-841b-f802db2582db] Unregistered the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1875.001929] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] [instance: 8d00162c-7379-48b6-841b-f802db2582db] Deleting contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1875.002123] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Deleting the datastore file [datastore2] 8d00162c-7379-48b6-841b-f802db2582db {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1875.002401] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-7f82c2e2-3fc3-49c9-84d4-ffa640e2af9a {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1875.008050] env[62277]: DEBUG oslo_vmware.api [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Waiting for the task: (returnval){ [ 1875.008050] env[62277]: value = "task-1405461" [ 1875.008050] env[62277]: _type = "Task" [ 1875.008050] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1875.015895] env[62277]: DEBUG oslo_vmware.api [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Task: {'id': task-1405461, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1875.454327] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Preparing fetch location {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1875.454624] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Creating directory with path [datastore2] vmware_temp/15e9f699-88f3-4b09-9cae-8854629f111f/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1875.454783] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-dc0c89de-ad60-4ad5-802d-a5d154144100 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1875.465449] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Created directory with path [datastore2] vmware_temp/15e9f699-88f3-4b09-9cae-8854629f111f/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1875.465639] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Fetch image to [datastore2] vmware_temp/15e9f699-88f3-4b09-9cae-8854629f111f/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1875.465802] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to [datastore2] vmware_temp/15e9f699-88f3-4b09-9cae-8854629f111f/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1875.466528] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-be4b4499-3ac1-4e44-9ed9-da783a45aafd {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1875.472940] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-77051fb5-f822-45e4-a4f8-46b5bb1d9517 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1875.481493] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bcd39238-c0ca-477a-b737-2a5f47fd1ffa {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1875.514181] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-ad155515-67e3-43d2-a852-59bed5982170 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1875.520616] env[62277]: DEBUG oslo_vmware.api [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Task: {'id': task-1405461, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074542} completed successfully. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1875.521946] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Deleted the datastore file {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1875.522143] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] [instance: 8d00162c-7379-48b6-841b-f802db2582db] Deleted contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1875.522308] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] [instance: 8d00162c-7379-48b6-841b-f802db2582db] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1875.522474] env[62277]: INFO nova.compute.manager [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] [instance: 8d00162c-7379-48b6-841b-f802db2582db] Took 0.59 seconds to destroy the instance on the hypervisor. 
[ 1875.524162] env[62277]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-80e56364-9aaa-4dff-907a-f42c98729d62 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1875.525925] env[62277]: DEBUG nova.compute.claims [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] [instance: 8d00162c-7379-48b6-841b-f802db2582db] Aborting claim: {{(pid=62277) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1875.526107] env[62277]: DEBUG oslo_concurrency.lockutils [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1875.526317] env[62277]: DEBUG oslo_concurrency.lockutils [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1875.545557] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1875.596433] env[62277]: DEBUG oslo_vmware.rw_handles [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/15e9f699-88f3-4b09-9cae-8854629f111f/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62277) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1875.656996] env[62277]: DEBUG oslo_vmware.rw_handles [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Completed reading data from the image iterator. {{(pid=62277) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1875.657194] env[62277]: DEBUG oslo_vmware.rw_handles [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/15e9f699-88f3-4b09-9cae-8854629f111f/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=62277) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1875.846498] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9c0bdc62-3b72-426d-9fca-6a044284c2ee {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1875.853939] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c1ca96e2-2122-4586-b6a5-45c6fae533de {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1875.883833] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ced2043-dd8c-4227-a650-9df1d2362c0f {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1875.891189] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-048aaacb-6084-4c00-b9ae-ebdee1310231 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1875.906102] env[62277]: DEBUG nova.compute.provider_tree [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1875.914011] env[62277]: DEBUG nova.scheduler.client.report [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1875.926667] env[62277]: DEBUG oslo_concurrency.lockutils [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.400s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1875.927237] env[62277]: ERROR nova.compute.manager [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] [instance: 8d00162c-7379-48b6-841b-f802db2582db] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1875.927237] env[62277]: Faults: ['InvalidArgument'] [ 1875.927237] env[62277]: ERROR nova.compute.manager [instance: 8d00162c-7379-48b6-841b-f802db2582db] Traceback (most recent call last): [ 1875.927237] env[62277]: ERROR nova.compute.manager [instance: 8d00162c-7379-48b6-841b-f802db2582db] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1875.927237] env[62277]: ERROR nova.compute.manager [instance: 
8d00162c-7379-48b6-841b-f802db2582db] self.driver.spawn(context, instance, image_meta, [ 1875.927237] env[62277]: ERROR nova.compute.manager [instance: 8d00162c-7379-48b6-841b-f802db2582db] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1875.927237] env[62277]: ERROR nova.compute.manager [instance: 8d00162c-7379-48b6-841b-f802db2582db] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1875.927237] env[62277]: ERROR nova.compute.manager [instance: 8d00162c-7379-48b6-841b-f802db2582db] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1875.927237] env[62277]: ERROR nova.compute.manager [instance: 8d00162c-7379-48b6-841b-f802db2582db] self._fetch_image_if_missing(context, vi) [ 1875.927237] env[62277]: ERROR nova.compute.manager [instance: 8d00162c-7379-48b6-841b-f802db2582db] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1875.927237] env[62277]: ERROR nova.compute.manager [instance: 8d00162c-7379-48b6-841b-f802db2582db] image_cache(vi, tmp_image_ds_loc) [ 1875.927237] env[62277]: ERROR nova.compute.manager [instance: 8d00162c-7379-48b6-841b-f802db2582db] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1875.927237] env[62277]: ERROR nova.compute.manager [instance: 8d00162c-7379-48b6-841b-f802db2582db] vm_util.copy_virtual_disk( [ 1875.927237] env[62277]: ERROR nova.compute.manager [instance: 8d00162c-7379-48b6-841b-f802db2582db] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1875.927237] env[62277]: ERROR nova.compute.manager [instance: 8d00162c-7379-48b6-841b-f802db2582db] session._wait_for_task(vmdk_copy_task) [ 1875.927237] env[62277]: ERROR nova.compute.manager [instance: 8d00162c-7379-48b6-841b-f802db2582db] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1875.927237] env[62277]: ERROR nova.compute.manager [instance: 8d00162c-7379-48b6-841b-f802db2582db] return self.wait_for_task(task_ref) [ 1875.927237] env[62277]: ERROR nova.compute.manager [instance: 8d00162c-7379-48b6-841b-f802db2582db] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1875.927237] env[62277]: ERROR nova.compute.manager [instance: 8d00162c-7379-48b6-841b-f802db2582db] return evt.wait() [ 1875.927237] env[62277]: ERROR nova.compute.manager [instance: 8d00162c-7379-48b6-841b-f802db2582db] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1875.927237] env[62277]: ERROR nova.compute.manager [instance: 8d00162c-7379-48b6-841b-f802db2582db] result = hub.switch() [ 1875.927237] env[62277]: ERROR nova.compute.manager [instance: 8d00162c-7379-48b6-841b-f802db2582db] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1875.927237] env[62277]: ERROR nova.compute.manager [instance: 8d00162c-7379-48b6-841b-f802db2582db] return self.greenlet.switch() [ 1875.927237] env[62277]: ERROR nova.compute.manager [instance: 8d00162c-7379-48b6-841b-f802db2582db] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1875.927237] env[62277]: ERROR nova.compute.manager [instance: 8d00162c-7379-48b6-841b-f802db2582db] self.f(*self.args, **self.kw) [ 1875.927237] env[62277]: ERROR nova.compute.manager [instance: 8d00162c-7379-48b6-841b-f802db2582db] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1875.927237] env[62277]: ERROR nova.compute.manager [instance: 8d00162c-7379-48b6-841b-f802db2582db] raise exceptions.translate_fault(task_info.error) [ 1875.927237] env[62277]: ERROR nova.compute.manager [instance: 8d00162c-7379-48b6-841b-f802db2582db] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1875.927237] env[62277]: ERROR nova.compute.manager [instance: 8d00162c-7379-48b6-841b-f802db2582db] Faults: ['InvalidArgument'] [ 1875.927237] env[62277]: ERROR nova.compute.manager [instance: 8d00162c-7379-48b6-841b-f802db2582db] [ 1875.928189] env[62277]: DEBUG nova.compute.utils [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] [instance: 8d00162c-7379-48b6-841b-f802db2582db] VimFaultException {{(pid=62277) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1875.929601] env[62277]: DEBUG nova.compute.manager [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] [instance: 8d00162c-7379-48b6-841b-f802db2582db] Build of instance 8d00162c-7379-48b6-841b-f802db2582db was re-scheduled: A specified parameter was not correct: fileType [ 1875.929601] env[62277]: Faults: ['InvalidArgument'] {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1875.930030] env[62277]: DEBUG nova.compute.manager [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] [instance: 8d00162c-7379-48b6-841b-f802db2582db] Unplugging VIFs for instance {{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1875.930209] env[62277]: DEBUG nova.compute.manager [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1875.930377] env[62277]: DEBUG nova.compute.manager [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] [instance: 8d00162c-7379-48b6-841b-f802db2582db] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1875.930536] env[62277]: DEBUG nova.network.neutron [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] [instance: 8d00162c-7379-48b6-841b-f802db2582db] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1876.319591] env[62277]: DEBUG nova.network.neutron [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] [instance: 8d00162c-7379-48b6-841b-f802db2582db] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1876.333968] env[62277]: INFO nova.compute.manager [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] [instance: 8d00162c-7379-48b6-841b-f802db2582db] Took 0.40 seconds to deallocate network for instance. [ 1876.422722] env[62277]: INFO nova.scheduler.client.report [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Deleted allocations for instance 8d00162c-7379-48b6-841b-f802db2582db [ 1876.444805] env[62277]: DEBUG oslo_concurrency.lockutils [None req-58d1c1ac-a102-40ed-9b22-5098c47561bc tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Lock "8d00162c-7379-48b6-841b-f802db2582db" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 637.442s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1876.445935] env[62277]: DEBUG oslo_concurrency.lockutils [None req-a00ef9d3-8870-42b4-a406-76583c695098 tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Lock "8d00162c-7379-48b6-841b-f802db2582db" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 440.253s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1876.446183] env[62277]: DEBUG oslo_concurrency.lockutils [None req-a00ef9d3-8870-42b4-a406-76583c695098 tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Acquiring lock "8d00162c-7379-48b6-841b-f802db2582db-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1876.446385] env[62277]: DEBUG oslo_concurrency.lockutils [None req-a00ef9d3-8870-42b4-a406-76583c695098 tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Lock "8d00162c-7379-48b6-841b-f802db2582db-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=62277) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1876.446549] env[62277]: DEBUG oslo_concurrency.lockutils [None req-a00ef9d3-8870-42b4-a406-76583c695098 tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Lock "8d00162c-7379-48b6-841b-f802db2582db-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1876.448467] env[62277]: INFO nova.compute.manager [None req-a00ef9d3-8870-42b4-a406-76583c695098 tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] [instance: 8d00162c-7379-48b6-841b-f802db2582db] Terminating instance [ 1876.450127] env[62277]: DEBUG nova.compute.manager [None req-a00ef9d3-8870-42b4-a406-76583c695098 tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] [instance: 8d00162c-7379-48b6-841b-f802db2582db] Start destroying the instance on the hypervisor. {{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1876.450321] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-a00ef9d3-8870-42b4-a406-76583c695098 tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] [instance: 8d00162c-7379-48b6-841b-f802db2582db] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1876.450789] env[62277]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-5479e771-61e1-42e9-a90c-b42f382f0cf1 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1876.459946] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f638a0c-a7c6-4ba1-be2b-f770dac63b92 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1876.489356] env[62277]: WARNING nova.virt.vmwareapi.vmops [None req-a00ef9d3-8870-42b4-a406-76583c695098 tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] [instance: 8d00162c-7379-48b6-841b-f802db2582db] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 8d00162c-7379-48b6-841b-f802db2582db could not be found. [ 1876.489567] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-a00ef9d3-8870-42b4-a406-76583c695098 tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] [instance: 8d00162c-7379-48b6-841b-f802db2582db] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1876.489792] env[62277]: INFO nova.compute.manager [None req-a00ef9d3-8870-42b4-a406-76583c695098 tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] [instance: 8d00162c-7379-48b6-841b-f802db2582db] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1876.490115] env[62277]: DEBUG oslo.service.loopingcall [None req-a00ef9d3-8870-42b4-a406-76583c695098 tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1876.490464] env[62277]: DEBUG nova.compute.manager [None req-ef4504dd-12e5-4062-9061-5368df2cee5e tempest-AttachVolumeNegativeTest-184472694 tempest-AttachVolumeNegativeTest-184472694-project-member] [instance: 272391f1-a349-4525-91ec-75b3ba7aeb1c] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1876.492967] env[62277]: DEBUG nova.compute.manager [-] [instance: 8d00162c-7379-48b6-841b-f802db2582db] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1876.493085] env[62277]: DEBUG nova.network.neutron [-] [instance: 8d00162c-7379-48b6-841b-f802db2582db] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1876.515726] env[62277]: DEBUG nova.compute.manager [None req-ef4504dd-12e5-4062-9061-5368df2cee5e tempest-AttachVolumeNegativeTest-184472694 tempest-AttachVolumeNegativeTest-184472694-project-member] [instance: 272391f1-a349-4525-91ec-75b3ba7aeb1c] Instance disappeared before build. {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1876.522408] env[62277]: DEBUG nova.network.neutron [-] [instance: 8d00162c-7379-48b6-841b-f802db2582db] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1876.529655] env[62277]: INFO nova.compute.manager [-] [instance: 8d00162c-7379-48b6-841b-f802db2582db] Took 0.04 seconds to deallocate network for instance. [ 1876.538906] env[62277]: DEBUG oslo_concurrency.lockutils [None req-ef4504dd-12e5-4062-9061-5368df2cee5e tempest-AttachVolumeNegativeTest-184472694 tempest-AttachVolumeNegativeTest-184472694-project-member] Lock "272391f1-a349-4525-91ec-75b3ba7aeb1c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 218.945s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1876.553854] env[62277]: DEBUG nova.compute.manager [None req-76e59b66-33b0-4df5-9a06-03c5bacc073b tempest-ServerPasswordTestJSON-23033146 tempest-ServerPasswordTestJSON-23033146-project-member] [instance: 227394fe-d0c6-48c8-aed2-433ce34e34f8] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1876.581570] env[62277]: DEBUG nova.compute.manager [None req-76e59b66-33b0-4df5-9a06-03c5bacc073b tempest-ServerPasswordTestJSON-23033146 tempest-ServerPasswordTestJSON-23033146-project-member] [instance: 227394fe-d0c6-48c8-aed2-433ce34e34f8] Instance disappeared before build. 
{{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1876.602404] env[62277]: DEBUG oslo_concurrency.lockutils [None req-76e59b66-33b0-4df5-9a06-03c5bacc073b tempest-ServerPasswordTestJSON-23033146 tempest-ServerPasswordTestJSON-23033146-project-member] Lock "227394fe-d0c6-48c8-aed2-433ce34e34f8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 216.929s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1876.614482] env[62277]: DEBUG nova.compute.manager [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1876.651381] env[62277]: DEBUG oslo_concurrency.lockutils [None req-a00ef9d3-8870-42b4-a406-76583c695098 tempest-ServerTagsTestJSON-125218464 tempest-ServerTagsTestJSON-125218464-project-member] Lock "8d00162c-7379-48b6-841b-f802db2582db" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.205s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1876.652557] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "8d00162c-7379-48b6-841b-f802db2582db" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 6.527s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1876.652557] env[62277]: INFO nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 8d00162c-7379-48b6-841b-f802db2582db] During sync_power_state the instance has a pending task (deleting). Skip. 
[ 1876.652557] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "8d00162c-7379-48b6-841b-f802db2582db" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1876.668701] env[62277]: DEBUG oslo_concurrency.lockutils [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1876.668960] env[62277]: DEBUG oslo_concurrency.lockutils [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1876.670338] env[62277]: INFO nova.compute.claims [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1876.933784] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f2a2bcb-6025-48a2-ad22-e89df4b92857 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1876.942234] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f83aa90-0578-4095-9217-a563ea4b26d5 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1876.972820] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da47d08a-8cfc-4ae2-8ba1-95f08cd57344 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1876.980328] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a17019e8-6701-4c96-931d-b025b4ee588a {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1876.993183] env[62277]: DEBUG nova.compute.provider_tree [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1877.001691] env[62277]: DEBUG nova.scheduler.client.report [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 
'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1877.016220] env[62277]: DEBUG oslo_concurrency.lockutils [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.347s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1877.016644] env[62277]: DEBUG nova.compute.manager [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Start building networks asynchronously for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1877.046490] env[62277]: DEBUG nova.compute.utils [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Using /dev/sd instead of None {{(pid=62277) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1877.047936] env[62277]: DEBUG nova.compute.manager [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Not allocating networking since 'none' was specified. {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1877.056519] env[62277]: DEBUG nova.compute.manager [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Start building block device mappings for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1877.117831] env[62277]: DEBUG nova.compute.manager [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Start spawning the instance on the hypervisor. 
{{(pid=62277) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1877.144165] env[62277]: DEBUG nova.virt.hardware [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-11-06T22:11:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-11-06T22:11:15Z,direct_url=,disk_format='vmdk',id=6f125163-af69-40e9-92ae-3b8a01d74b60,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='63ed6630e9c140baa826f53d7a0564d1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-11-06T22:11:16Z,virtual_size=,visibility=), allow threads: False {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1877.144432] env[62277]: DEBUG nova.virt.hardware [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Flavor limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1877.144588] env[62277]: DEBUG nova.virt.hardware [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Image limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1877.144771] env[62277]: DEBUG nova.virt.hardware [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Flavor pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1877.144916] env[62277]: DEBUG nova.virt.hardware [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Image pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1877.145081] env[62277]: DEBUG nova.virt.hardware [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1877.145293] env[62277]: DEBUG nova.virt.hardware [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1877.145447] env[62277]: DEBUG nova.virt.hardware [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1877.145608] env[62277]: DEBUG nova.virt.hardware [None 
req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Got 1 possible topologies {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1877.145765] env[62277]: DEBUG nova.virt.hardware [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1877.145964] env[62277]: DEBUG nova.virt.hardware [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1877.146815] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f81158e2-a0f2-403b-90d5-630cee9e656f {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1877.156435] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3bf6de0-bccf-4481-a611-c577cef503db {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1877.169616] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Instance VIF info [] {{(pid=62277) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1877.175806] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Creating folder: Project (71cdb6e1a99a4128918b0d46038d456e). Parent ref: group-v297781. {{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1877.176079] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-3ae1d4ca-9cac-4d90-bdd6-bb8b61ad124e {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1877.185600] env[62277]: INFO nova.virt.vmwareapi.vm_util [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Created folder: Project (71cdb6e1a99a4128918b0d46038d456e) in parent group-v297781. [ 1877.185770] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Creating folder: Instances. Parent ref: group-v297874. 
{{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1877.185994] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-8f95d632-2575-418a-a561-6e9bf8803b2b {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1877.194419] env[62277]: INFO nova.virt.vmwareapi.vm_util [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Created folder: Instances in parent group-v297874. [ 1877.194639] env[62277]: DEBUG oslo.service.loopingcall [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1877.194810] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Creating VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1877.194996] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-f847ba0d-c167-4c51-a6c5-23cf0264ad8b {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1877.211851] env[62277]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1877.211851] env[62277]: value = "task-1405464" [ 1877.211851] env[62277]: _type = "Task" [ 1877.211851] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1877.218963] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405464, 'name': CreateVM_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1877.721699] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405464, 'name': CreateVM_Task, 'duration_secs': 0.252372} completed successfully. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1877.722674] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Created VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1877.722674] env[62277]: DEBUG oslo_concurrency.lockutils [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1877.722817] env[62277]: DEBUG oslo_concurrency.lockutils [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1877.724040] env[62277]: DEBUG oslo_concurrency.lockutils [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1877.724040] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f4f56cb2-3213-481f-addf-ebe38c2eb98a {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1877.727828] env[62277]: DEBUG oslo_vmware.api [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Waiting for the task: (returnval){ [ 1877.727828] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]524c5748-680a-6bbd-3eea-9edc715f999f" [ 1877.727828] env[62277]: _type = "Task" [ 1877.727828] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1877.735203] env[62277]: DEBUG oslo_vmware.api [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]524c5748-680a-6bbd-3eea-9edc715f999f, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1878.238148] env[62277]: DEBUG oslo_concurrency.lockutils [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1878.238406] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Processing image 6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1878.238615] env[62277]: DEBUG oslo_concurrency.lockutils [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1879.239438] env[62277]: DEBUG oslo_concurrency.lockutils [None req-1c837b54-288e-4369-b0ed-3ab98b6f7f71 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Acquiring lock "9f7d4431-d5ea-4f9b-888b-77a6a7772047" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1885.214475] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1887.164334] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1887.167936] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1889.168409] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1889.168730] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Starting heal instance info cache {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9909}} [ 1889.168730] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Rebuilding the list of instances to heal {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9913}} [ 1889.190295] env[62277]: DEBUG nova.compute.manager [None 
req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1889.190506] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1889.190571] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 13959890-87a1-45ba-98de-621373e265e7] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1889.190702] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1889.190841] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1889.190934] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1889.191066] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1889.191227] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1889.191323] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1889.191417] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1889.191537] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Didn't find any instances for network info cache update. 
{{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9995}} [ 1889.192039] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1890.169231] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager.update_available_resource {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1890.185078] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1890.185316] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1890.185479] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1890.185632] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62277) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1890.187166] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3fa6c464-793f-4bcc-8693-c77b8b378c7a {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1890.195419] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf578dc6-3a67-41f7-bd21-62786e61099b {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1890.210420] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bcde3c93-003f-4862-9181-055e37df41a3 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1890.216484] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8390cc4d-df8f-475b-a3e8-dbebf4d6e281 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1890.245048] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181436MB free_disk=184GB free_vcpus=48 pci_devices=None {{(pid=62277) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1890.245224] env[62277]: DEBUG 
oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1890.245350] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1890.315733] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 900160c8-a715-45a4-8709-b314fc3216d5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1890.315820] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 63267d5c-d004-41c1-866a-75b9e37521b7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1890.315942] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 13959890-87a1-45ba-98de-621373e265e7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1890.316121] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance a7cc7e45-8567-4699-af83-624b1c7c5c64 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1890.316176] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 42005809-1926-44b2-8ef6-3b6cb28a4020 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1890.316299] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1890.316417] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1890.316530] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 163eb4e7-33f8-4674-8a3f-5094356e250d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1890.316642] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 400beb27-a709-4ef4-851e-5caaab9ca60b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1890.316754] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 9f7d4431-d5ea-4f9b-888b-77a6a7772047 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1890.327457] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 8b9ef530-e79f-4cd4-8a88-83871ed65f90 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1890.337992] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 9bf5cf30-5142-4d51-b5d9-1bbfb9eedce8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1890.348154] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance c4c22c8a-4300-45ce-8484-77c638f7bbc5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1890.359590] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 97a1dad9-a665-42ca-b85e-5fef59ab80bf has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1890.369179] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 7748e3a1-6adc-4482-90f9-a3816a224272 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1890.380799] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 71f172bf-94bd-4027-bbae-f3bd3e1f91c3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1890.389510] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance ad83bf06-d712-4bd4-8086-9c3b615adaf5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1890.400243] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance e0c4112c-6bc6-44e4-8e43-fda8203bf1c3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1890.410769] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 81c79e22-aaa3-45dc-967a-b4a884f692eb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1890.410769] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1890.410769] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1890.604864] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-76dd78b3-03a1-4762-b242-7ed13d5e050d {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1890.612482] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-06934f9b-978c-4184-baed-df54b14c6e99 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1890.642113] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a9416681-98c4-430b-b689-9a1594277e5e {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1890.648652] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b455d50-77dc-48d4-aaee-713c5aed8893 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1890.660789] env[62277]: DEBUG nova.compute.provider_tree [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1890.668309] env[62277]: DEBUG nova.scheduler.client.report [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1890.683725] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62277) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1890.683897] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.439s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1891.684365] env[62277]: DEBUG oslo_service.periodic_task [None 
req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1892.822168] env[62277]: DEBUG oslo_concurrency.lockutils [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Acquiring lock "de543a46-26c3-40b3-9ccd-80bb1f9845d7" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1892.822496] env[62277]: DEBUG oslo_concurrency.lockutils [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Lock "de543a46-26c3-40b3-9ccd-80bb1f9845d7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1893.168160] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1894.168696] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1894.169253] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=62277) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10528}} [ 1923.441145] env[62277]: WARNING oslo_vmware.rw_handles [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1923.441145] env[62277]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1923.441145] env[62277]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1923.441145] env[62277]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1923.441145] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1923.441145] env[62277]: ERROR oslo_vmware.rw_handles response.begin() [ 1923.441145] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1923.441145] env[62277]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1923.441145] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1923.441145] env[62277]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1923.441145] env[62277]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1923.441145] env[62277]: ERROR oslo_vmware.rw_handles [ 1923.441855] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Downloaded image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to vmware_temp/15e9f699-88f3-4b09-9cae-8854629f111f/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1923.443549] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Caching image {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1923.443800] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Copying Virtual Disk [datastore2] vmware_temp/15e9f699-88f3-4b09-9cae-8854629f111f/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk to [datastore2] vmware_temp/15e9f699-88f3-4b09-9cae-8854629f111f/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk {{(pid=62277) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1923.444104] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-dc94d267-07d4-490f-9b6b-8192d8d1f76f {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1923.452858] env[62277]: DEBUG oslo_vmware.api [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Waiting for the 
task: (returnval){ [ 1923.452858] env[62277]: value = "task-1405465" [ 1923.452858] env[62277]: _type = "Task" [ 1923.452858] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1923.461317] env[62277]: DEBUG oslo_vmware.api [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Task: {'id': task-1405465, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1923.964013] env[62277]: DEBUG oslo_vmware.exceptions [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Fault InvalidArgument not matched. {{(pid=62277) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1923.964349] env[62277]: DEBUG oslo_concurrency.lockutils [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1923.964935] env[62277]: ERROR nova.compute.manager [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1923.964935] env[62277]: Faults: ['InvalidArgument'] [ 1923.964935] env[62277]: ERROR nova.compute.manager [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Traceback (most recent call last): [ 1923.964935] env[62277]: ERROR nova.compute.manager [instance: 900160c8-a715-45a4-8709-b314fc3216d5] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1923.964935] env[62277]: ERROR nova.compute.manager [instance: 900160c8-a715-45a4-8709-b314fc3216d5] yield resources [ 1923.964935] env[62277]: ERROR nova.compute.manager [instance: 900160c8-a715-45a4-8709-b314fc3216d5] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1923.964935] env[62277]: ERROR nova.compute.manager [instance: 900160c8-a715-45a4-8709-b314fc3216d5] self.driver.spawn(context, instance, image_meta, [ 1923.964935] env[62277]: ERROR nova.compute.manager [instance: 900160c8-a715-45a4-8709-b314fc3216d5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1923.964935] env[62277]: ERROR nova.compute.manager [instance: 900160c8-a715-45a4-8709-b314fc3216d5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1923.964935] env[62277]: ERROR nova.compute.manager [instance: 900160c8-a715-45a4-8709-b314fc3216d5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1923.964935] env[62277]: ERROR nova.compute.manager [instance: 900160c8-a715-45a4-8709-b314fc3216d5] self._fetch_image_if_missing(context, vi) [ 1923.964935] env[62277]: ERROR nova.compute.manager [instance: 900160c8-a715-45a4-8709-b314fc3216d5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1923.964935] env[62277]: ERROR nova.compute.manager [instance: 
900160c8-a715-45a4-8709-b314fc3216d5] image_cache(vi, tmp_image_ds_loc) [ 1923.964935] env[62277]: ERROR nova.compute.manager [instance: 900160c8-a715-45a4-8709-b314fc3216d5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1923.964935] env[62277]: ERROR nova.compute.manager [instance: 900160c8-a715-45a4-8709-b314fc3216d5] vm_util.copy_virtual_disk( [ 1923.964935] env[62277]: ERROR nova.compute.manager [instance: 900160c8-a715-45a4-8709-b314fc3216d5] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1923.964935] env[62277]: ERROR nova.compute.manager [instance: 900160c8-a715-45a4-8709-b314fc3216d5] session._wait_for_task(vmdk_copy_task) [ 1923.964935] env[62277]: ERROR nova.compute.manager [instance: 900160c8-a715-45a4-8709-b314fc3216d5] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1923.964935] env[62277]: ERROR nova.compute.manager [instance: 900160c8-a715-45a4-8709-b314fc3216d5] return self.wait_for_task(task_ref) [ 1923.964935] env[62277]: ERROR nova.compute.manager [instance: 900160c8-a715-45a4-8709-b314fc3216d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1923.964935] env[62277]: ERROR nova.compute.manager [instance: 900160c8-a715-45a4-8709-b314fc3216d5] return evt.wait() [ 1923.964935] env[62277]: ERROR nova.compute.manager [instance: 900160c8-a715-45a4-8709-b314fc3216d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1923.964935] env[62277]: ERROR nova.compute.manager [instance: 900160c8-a715-45a4-8709-b314fc3216d5] result = hub.switch() [ 1923.964935] env[62277]: ERROR nova.compute.manager [instance: 900160c8-a715-45a4-8709-b314fc3216d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1923.964935] env[62277]: ERROR nova.compute.manager [instance: 900160c8-a715-45a4-8709-b314fc3216d5] return self.greenlet.switch() [ 1923.964935] env[62277]: ERROR nova.compute.manager [instance: 900160c8-a715-45a4-8709-b314fc3216d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1923.964935] env[62277]: ERROR nova.compute.manager [instance: 900160c8-a715-45a4-8709-b314fc3216d5] self.f(*self.args, **self.kw) [ 1923.964935] env[62277]: ERROR nova.compute.manager [instance: 900160c8-a715-45a4-8709-b314fc3216d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1923.964935] env[62277]: ERROR nova.compute.manager [instance: 900160c8-a715-45a4-8709-b314fc3216d5] raise exceptions.translate_fault(task_info.error) [ 1923.964935] env[62277]: ERROR nova.compute.manager [instance: 900160c8-a715-45a4-8709-b314fc3216d5] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1923.964935] env[62277]: ERROR nova.compute.manager [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Faults: ['InvalidArgument'] [ 1923.964935] env[62277]: ERROR nova.compute.manager [instance: 900160c8-a715-45a4-8709-b314fc3216d5] [ 1923.965937] env[62277]: INFO nova.compute.manager [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Terminating instance [ 1923.967027] env[62277]: DEBUG oslo_concurrency.lockutils [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 
tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1923.967264] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1923.967532] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-0b52dc6f-4cc7-4835-8ed9-ff1bd64afbf3 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1923.969766] env[62277]: DEBUG nova.compute.manager [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Start destroying the instance on the hypervisor. {{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1923.969981] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1923.970759] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e041e20d-4962-49f4-95c2-251da07f0dfa {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1923.977791] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Unregistering the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1923.978822] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-cd9e32f4-9379-4d87-abf2-ba0b08249695 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1923.980328] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1923.980488] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=62277) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1923.981185] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1dab8123-b640-460b-b121-0e9b78a17984 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1923.985991] env[62277]: DEBUG oslo_vmware.api [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Waiting for the task: (returnval){ [ 1923.985991] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]522c87e8-6ac6-ca1a-1bd7-238a428a232e" [ 1923.985991] env[62277]: _type = "Task" [ 1923.985991] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1923.993550] env[62277]: DEBUG oslo_vmware.api [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]522c87e8-6ac6-ca1a-1bd7-238a428a232e, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1924.351728] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Unregistered the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1924.351950] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Deleting contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1924.352152] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Deleting the datastore file [datastore2] 900160c8-a715-45a4-8709-b314fc3216d5 {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1924.352430] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-fb1ad688-d948-4638-a0f6-19b4c99fe2b9 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1924.358568] env[62277]: DEBUG oslo_vmware.api [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Waiting for the task: (returnval){ [ 1924.358568] env[62277]: value = "task-1405467" [ 1924.358568] env[62277]: _type = "Task" [ 1924.358568] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1924.365911] env[62277]: DEBUG oslo_vmware.api [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Task: {'id': task-1405467, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1924.496724] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Preparing fetch location {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1924.497081] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Creating directory with path [datastore2] vmware_temp/085cd5ff-4f9d-4bd9-9b5c-5c2048e0c462/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1924.497448] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b50a6141-dce4-4647-8993-4d1e1e8a00cf {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1924.508378] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Created directory with path [datastore2] vmware_temp/085cd5ff-4f9d-4bd9-9b5c-5c2048e0c462/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1924.508586] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Fetch image to [datastore2] vmware_temp/085cd5ff-4f9d-4bd9-9b5c-5c2048e0c462/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1924.508762] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to [datastore2] vmware_temp/085cd5ff-4f9d-4bd9-9b5c-5c2048e0c462/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1924.509602] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3ba1775-095d-40e9-9946-da8d80546aca {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1924.516052] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c0e04329-5d80-47aa-94cc-a75e6efb057b {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1924.525192] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1920c945-a2d4-46e2-8da1-e6d83db41bf9 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1924.555783] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4986a6b2-9cf5-4165-bff3-4525257b5375 {{(pid=62277) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1924.561497] env[62277]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-46a6976b-d15c-4369-b146-3d5363cbd176 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1924.582611] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1924.632193] env[62277]: DEBUG oslo_vmware.rw_handles [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/085cd5ff-4f9d-4bd9-9b5c-5c2048e0c462/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62277) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1924.691405] env[62277]: DEBUG oslo_vmware.rw_handles [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Completed reading data from the image iterator. {{(pid=62277) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1924.691677] env[62277]: DEBUG oslo_vmware.rw_handles [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/085cd5ff-4f9d-4bd9-9b5c-5c2048e0c462/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62277) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1924.868128] env[62277]: DEBUG oslo_vmware.api [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Task: {'id': task-1405467, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074241} completed successfully. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1924.868376] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Deleted the datastore file {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1924.868553] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Deleted contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1924.868719] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1924.868888] env[62277]: INFO nova.compute.manager [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Took 0.90 seconds to destroy the instance on the hypervisor. [ 1924.870970] env[62277]: DEBUG nova.compute.claims [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Aborting claim: {{(pid=62277) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1924.871152] env[62277]: DEBUG oslo_concurrency.lockutils [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1924.871365] env[62277]: DEBUG oslo_concurrency.lockutils [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1925.131762] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7cdd9a0a-f03f-4cad-87c9-f54e293a28cd {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1925.139603] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-85c98dc3-107c-4293-8128-a80563542c08 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1925.169014] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d5abe29e-9bab-4f58-bd23-0484d784970f {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1925.176082] env[62277]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b8aaf3b8-69c6-4c04-a38c-bca4d84a92f0 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1925.188591] env[62277]: DEBUG nova.compute.provider_tree [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1925.197358] env[62277]: DEBUG nova.scheduler.client.report [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1925.215398] env[62277]: DEBUG oslo_concurrency.lockutils [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.343s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1925.215398] env[62277]: ERROR nova.compute.manager [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1925.215398] env[62277]: Faults: ['InvalidArgument'] [ 1925.215398] env[62277]: ERROR nova.compute.manager [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Traceback (most recent call last): [ 1925.215398] env[62277]: ERROR nova.compute.manager [instance: 900160c8-a715-45a4-8709-b314fc3216d5] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1925.215398] env[62277]: ERROR nova.compute.manager [instance: 900160c8-a715-45a4-8709-b314fc3216d5] self.driver.spawn(context, instance, image_meta, [ 1925.215398] env[62277]: ERROR nova.compute.manager [instance: 900160c8-a715-45a4-8709-b314fc3216d5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1925.215398] env[62277]: ERROR nova.compute.manager [instance: 900160c8-a715-45a4-8709-b314fc3216d5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1925.215398] env[62277]: ERROR nova.compute.manager [instance: 900160c8-a715-45a4-8709-b314fc3216d5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1925.215398] env[62277]: ERROR nova.compute.manager [instance: 900160c8-a715-45a4-8709-b314fc3216d5] self._fetch_image_if_missing(context, vi) [ 1925.215398] env[62277]: ERROR nova.compute.manager [instance: 900160c8-a715-45a4-8709-b314fc3216d5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 
1925.215398] env[62277]: ERROR nova.compute.manager [instance: 900160c8-a715-45a4-8709-b314fc3216d5] image_cache(vi, tmp_image_ds_loc) [ 1925.215398] env[62277]: ERROR nova.compute.manager [instance: 900160c8-a715-45a4-8709-b314fc3216d5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1925.215398] env[62277]: ERROR nova.compute.manager [instance: 900160c8-a715-45a4-8709-b314fc3216d5] vm_util.copy_virtual_disk( [ 1925.215398] env[62277]: ERROR nova.compute.manager [instance: 900160c8-a715-45a4-8709-b314fc3216d5] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1925.215398] env[62277]: ERROR nova.compute.manager [instance: 900160c8-a715-45a4-8709-b314fc3216d5] session._wait_for_task(vmdk_copy_task) [ 1925.215398] env[62277]: ERROR nova.compute.manager [instance: 900160c8-a715-45a4-8709-b314fc3216d5] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1925.215398] env[62277]: ERROR nova.compute.manager [instance: 900160c8-a715-45a4-8709-b314fc3216d5] return self.wait_for_task(task_ref) [ 1925.215398] env[62277]: ERROR nova.compute.manager [instance: 900160c8-a715-45a4-8709-b314fc3216d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1925.215398] env[62277]: ERROR nova.compute.manager [instance: 900160c8-a715-45a4-8709-b314fc3216d5] return evt.wait() [ 1925.215398] env[62277]: ERROR nova.compute.manager [instance: 900160c8-a715-45a4-8709-b314fc3216d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1925.215398] env[62277]: ERROR nova.compute.manager [instance: 900160c8-a715-45a4-8709-b314fc3216d5] result = hub.switch() [ 1925.215398] env[62277]: ERROR nova.compute.manager [instance: 900160c8-a715-45a4-8709-b314fc3216d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1925.215398] env[62277]: ERROR nova.compute.manager [instance: 900160c8-a715-45a4-8709-b314fc3216d5] return self.greenlet.switch() [ 1925.215398] env[62277]: ERROR nova.compute.manager [instance: 900160c8-a715-45a4-8709-b314fc3216d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1925.215398] env[62277]: ERROR nova.compute.manager [instance: 900160c8-a715-45a4-8709-b314fc3216d5] self.f(*self.args, **self.kw) [ 1925.215398] env[62277]: ERROR nova.compute.manager [instance: 900160c8-a715-45a4-8709-b314fc3216d5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1925.215398] env[62277]: ERROR nova.compute.manager [instance: 900160c8-a715-45a4-8709-b314fc3216d5] raise exceptions.translate_fault(task_info.error) [ 1925.215398] env[62277]: ERROR nova.compute.manager [instance: 900160c8-a715-45a4-8709-b314fc3216d5] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1925.215398] env[62277]: ERROR nova.compute.manager [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Faults: ['InvalidArgument'] [ 1925.215398] env[62277]: ERROR nova.compute.manager [instance: 900160c8-a715-45a4-8709-b314fc3216d5] [ 1925.216239] env[62277]: DEBUG nova.compute.utils [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] VimFaultException {{(pid=62277) notify_about_instance_usage 
/opt/stack/nova/nova/compute/utils.py:430}} [ 1925.217262] env[62277]: DEBUG nova.compute.manager [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Build of instance 900160c8-a715-45a4-8709-b314fc3216d5 was re-scheduled: A specified parameter was not correct: fileType [ 1925.217262] env[62277]: Faults: ['InvalidArgument'] {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1925.217625] env[62277]: DEBUG nova.compute.manager [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Unplugging VIFs for instance {{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1925.217794] env[62277]: DEBUG nova.compute.manager [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1925.217962] env[62277]: DEBUG nova.compute.manager [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1925.218134] env[62277]: DEBUG nova.network.neutron [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1925.799385] env[62277]: DEBUG oslo_concurrency.lockutils [None req-d5c0f938-9ce0-421d-896b-33de135b567b tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Acquiring lock "b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1925.884415] env[62277]: DEBUG nova.network.neutron [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1925.899875] env[62277]: INFO nova.compute.manager [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Took 0.68 seconds to deallocate network for instance. 
[ 1926.000343] env[62277]: INFO nova.scheduler.client.report [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Deleted allocations for instance 900160c8-a715-45a4-8709-b314fc3216d5 [ 1926.018974] env[62277]: DEBUG oslo_concurrency.lockutils [None req-79e9c93a-15e2-431d-a1d1-d4cf24af2bea tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Lock "900160c8-a715-45a4-8709-b314fc3216d5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 636.625s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1926.019798] env[62277]: DEBUG oslo_concurrency.lockutils [None req-d6bf19c4-020a-4aca-8044-ada8667f45e5 tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Lock "900160c8-a715-45a4-8709-b314fc3216d5" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 440.357s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1926.020031] env[62277]: DEBUG oslo_concurrency.lockutils [None req-d6bf19c4-020a-4aca-8044-ada8667f45e5 tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Acquiring lock "900160c8-a715-45a4-8709-b314fc3216d5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1926.020255] env[62277]: DEBUG oslo_concurrency.lockutils [None req-d6bf19c4-020a-4aca-8044-ada8667f45e5 tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Lock "900160c8-a715-45a4-8709-b314fc3216d5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1926.020448] env[62277]: DEBUG oslo_concurrency.lockutils [None req-d6bf19c4-020a-4aca-8044-ada8667f45e5 tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Lock "900160c8-a715-45a4-8709-b314fc3216d5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1926.022446] env[62277]: INFO nova.compute.manager [None req-d6bf19c4-020a-4aca-8044-ada8667f45e5 tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Terminating instance [ 1926.024043] env[62277]: DEBUG nova.compute.manager [None req-d6bf19c4-020a-4aca-8044-ada8667f45e5 tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Start destroying the instance on the hypervisor. 
{{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1926.024216] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-d6bf19c4-020a-4aca-8044-ada8667f45e5 tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1926.024976] env[62277]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-e02ec39b-1208-40ce-ae4b-4d055cd91025 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1926.034205] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c08c6bed-16e9-4733-ad05-84692f541ebd {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1926.046523] env[62277]: DEBUG nova.compute.manager [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1926.066644] env[62277]: WARNING nova.virt.vmwareapi.vmops [None req-d6bf19c4-020a-4aca-8044-ada8667f45e5 tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 900160c8-a715-45a4-8709-b314fc3216d5 could not be found. [ 1926.066854] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-d6bf19c4-020a-4aca-8044-ada8667f45e5 tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1926.067082] env[62277]: INFO nova.compute.manager [None req-d6bf19c4-020a-4aca-8044-ada8667f45e5 tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1926.067296] env[62277]: DEBUG oslo.service.loopingcall [None req-d6bf19c4-020a-4aca-8044-ada8667f45e5 tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1926.067527] env[62277]: DEBUG nova.compute.manager [-] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1926.067626] env[62277]: DEBUG nova.network.neutron [-] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1926.096048] env[62277]: DEBUG oslo_concurrency.lockutils [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1926.096299] env[62277]: DEBUG oslo_concurrency.lockutils [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1926.097785] env[62277]: INFO nova.compute.claims [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1926.100952] env[62277]: DEBUG nova.network.neutron [-] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1926.109752] env[62277]: INFO nova.compute.manager [-] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] Took 0.04 seconds to deallocate network for instance. [ 1926.217468] env[62277]: DEBUG oslo_concurrency.lockutils [None req-d6bf19c4-020a-4aca-8044-ada8667f45e5 tempest-InstanceActionsTestJSON-523053051 tempest-InstanceActionsTestJSON-523053051-project-member] Lock "900160c8-a715-45a4-8709-b314fc3216d5" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.198s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1926.220796] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "900160c8-a715-45a4-8709-b314fc3216d5" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 56.095s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1926.220980] env[62277]: INFO nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 900160c8-a715-45a4-8709-b314fc3216d5] During sync_power_state the instance has a pending task (deleting). Skip. 
[ 1926.221569] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "900160c8-a715-45a4-8709-b314fc3216d5" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1926.376116] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f18a9550-ac04-41a7-a0f8-03f37367380b {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1926.384016] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f155b758-1212-48a2-97cc-3a1e3bb257e0 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1926.414705] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-46a6492c-2395-45ae-bd8e-9c1d8583dec9 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1926.421306] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7af68dbf-6a7f-41a8-a818-e060c26f409c {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1926.433666] env[62277]: DEBUG nova.compute.provider_tree [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1926.443887] env[62277]: DEBUG nova.scheduler.client.report [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1926.457972] env[62277]: DEBUG oslo_concurrency.lockutils [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.362s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1926.458519] env[62277]: DEBUG nova.compute.manager [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Start building networks asynchronously for instance. 
{{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1926.489875] env[62277]: DEBUG nova.compute.utils [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Using /dev/sd instead of None {{(pid=62277) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1926.491635] env[62277]: DEBUG nova.compute.manager [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Not allocating networking since 'none' was specified. {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1926.499246] env[62277]: DEBUG nova.compute.manager [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Start building block device mappings for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1926.566959] env[62277]: DEBUG nova.compute.manager [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Start spawning the instance on the hypervisor. {{(pid=62277) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1926.593326] env[62277]: DEBUG nova.virt.hardware [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-11-06T22:11:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-11-06T22:11:15Z,direct_url=,disk_format='vmdk',id=6f125163-af69-40e9-92ae-3b8a01d74b60,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='63ed6630e9c140baa826f53d7a0564d1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-11-06T22:11:16Z,virtual_size=,visibility=), allow threads: False {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1926.593575] env[62277]: DEBUG nova.virt.hardware [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Flavor limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1926.593911] env[62277]: DEBUG nova.virt.hardware [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Image limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1926.593911] env[62277]: DEBUG nova.virt.hardware [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Flavor pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1926.594054] env[62277]: DEBUG nova.virt.hardware [None 
req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Image pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1926.594188] env[62277]: DEBUG nova.virt.hardware [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1926.594408] env[62277]: DEBUG nova.virt.hardware [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1926.594558] env[62277]: DEBUG nova.virt.hardware [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1926.594726] env[62277]: DEBUG nova.virt.hardware [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Got 1 possible topologies {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1926.594881] env[62277]: DEBUG nova.virt.hardware [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1926.595065] env[62277]: DEBUG nova.virt.hardware [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1926.595993] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a3981e3a-99a8-4a1e-bc2b-1101adf99d3b {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1926.603961] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eda5f710-d9b6-41be-b096-7bfce2680b03 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1926.617578] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Instance VIF info [] {{(pid=62277) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1926.623108] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Creating folder: Project (38b8a26a2d9c41bd94d44a99b9db21cf). Parent ref: group-v297781. 
{{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1926.623393] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7628d478-b657-48d1-9847-fc20ad4d1f55 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1926.632810] env[62277]: INFO nova.virt.vmwareapi.vm_util [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Created folder: Project (38b8a26a2d9c41bd94d44a99b9db21cf) in parent group-v297781. [ 1926.632995] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Creating folder: Instances. Parent ref: group-v297877. {{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1926.633229] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-fa63cf27-e89c-402d-a880-b333f3333df7 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1926.641033] env[62277]: INFO nova.virt.vmwareapi.vm_util [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Created folder: Instances in parent group-v297877. [ 1926.641266] env[62277]: DEBUG oslo.service.loopingcall [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1926.641447] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Creating VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1926.641643] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-a0e7adbe-7a22-4e2e-bd86-3a2ea5769d9d {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1926.657239] env[62277]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1926.657239] env[62277]: value = "task-1405470" [ 1926.657239] env[62277]: _type = "Task" [ 1926.657239] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1926.664695] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405470, 'name': CreateVM_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1927.168810] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405470, 'name': CreateVM_Task, 'duration_secs': 0.232356} completed successfully. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1927.169075] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Created VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1927.169419] env[62277]: DEBUG oslo_concurrency.lockutils [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1927.169572] env[62277]: DEBUG oslo_concurrency.lockutils [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1927.169881] env[62277]: DEBUG oslo_concurrency.lockutils [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1927.170135] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-13b71e3f-3c05-45d4-81ba-7f63c40f45ee {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1927.174328] env[62277]: DEBUG oslo_vmware.api [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Waiting for the task: (returnval){ [ 1927.174328] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52806f98-dcee-707a-14af-be91bd3df913" [ 1927.174328] env[62277]: _type = "Task" [ 1927.174328] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1927.181529] env[62277]: DEBUG oslo_vmware.api [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52806f98-dcee-707a-14af-be91bd3df913, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1927.684134] env[62277]: DEBUG oslo_concurrency.lockutils [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1927.684428] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Processing image 6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1927.684638] env[62277]: DEBUG oslo_concurrency.lockutils [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1929.062547] env[62277]: DEBUG oslo_concurrency.lockutils [None req-0066ec6d-492d-414e-babb-2c6a13a6a502 tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Acquiring lock "8b9ef530-e79f-4cd4-8a88-83871ed65f90" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1946.169804] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1948.164818] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1948.168430] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1949.169278] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1950.168981] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager.update_available_resource {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1950.181244] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62277) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1950.181543] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1950.181670] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1950.181853] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62277) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1950.183571] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-46101fa5-155e-4811-bb12-f75d37b818db {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1950.192509] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-afe142c9-f6f9-4def-9397-a681fadab817 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1950.207875] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dc657afc-f79a-4000-b5ee-7ed99bc8c2d2 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1950.214892] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5df71f4a-f34a-4863-8d84-6a1c8fa2f1a3 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1950.248913] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181313MB free_disk=184GB free_vcpus=48 pci_devices=None {{(pid=62277) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1950.249121] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1950.249368] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1950.324403] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 63267d5c-d004-41c1-866a-75b9e37521b7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 
'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1950.324609] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 13959890-87a1-45ba-98de-621373e265e7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1950.324750] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance a7cc7e45-8567-4699-af83-624b1c7c5c64 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1950.324906] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 42005809-1926-44b2-8ef6-3b6cb28a4020 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1950.325057] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1950.325213] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1950.325362] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 163eb4e7-33f8-4674-8a3f-5094356e250d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1950.325494] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 400beb27-a709-4ef4-851e-5caaab9ca60b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1950.325644] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 9f7d4431-d5ea-4f9b-888b-77a6a7772047 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1950.325768] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 8b9ef530-e79f-4cd4-8a88-83871ed65f90 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1950.337127] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance c4c22c8a-4300-45ce-8484-77c638f7bbc5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1950.348814] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 97a1dad9-a665-42ca-b85e-5fef59ab80bf has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1950.359431] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 7748e3a1-6adc-4482-90f9-a3816a224272 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1950.368896] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 71f172bf-94bd-4027-bbae-f3bd3e1f91c3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1950.378088] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance ad83bf06-d712-4bd4-8086-9c3b615adaf5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1950.388301] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance e0c4112c-6bc6-44e4-8e43-fda8203bf1c3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1950.396909] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 81c79e22-aaa3-45dc-967a-b4a884f692eb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1950.406081] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance de543a46-26c3-40b3-9ccd-80bb1f9845d7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1950.406339] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1950.406464] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1950.598178] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ad354677-1195-406b-8aa0-63e35042cf61 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1950.605877] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e46978ab-bb48-4508-a606-a24e4df615dc {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1950.635316] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3038001f-aece-4aaa-8928-2e3d2aad80c2 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1950.641804] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-20ca6479-cc66-4989-b8c2-a2b70b14fc0c {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1950.654185] env[62277]: DEBUG nova.compute.provider_tree [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1950.662342] env[62277]: DEBUG nova.scheduler.client.report [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 
'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1950.675032] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62277) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1950.675210] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.426s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1951.675334] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1951.675590] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Starting heal instance info cache {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9909}} [ 1951.675687] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Rebuilding the list of instances to heal {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9913}} [ 1951.696626] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1951.696821] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 13959890-87a1-45ba-98de-621373e265e7] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1951.696978] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1951.697145] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1951.697357] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1951.697550] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Skipping network cache update for instance because it is Building. 
{{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1951.697723] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1951.697858] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1951.698016] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1951.698158] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 1951.698303] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Didn't find any instances for network info cache update. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9995}} [ 1951.698824] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1953.168580] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1953.188769] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1956.168944] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1956.169241] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=62277) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10528}} [ 1972.834307] env[62277]: WARNING oslo_vmware.rw_handles [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1972.834307] env[62277]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1972.834307] env[62277]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1972.834307] env[62277]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1972.834307] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1972.834307] env[62277]: ERROR oslo_vmware.rw_handles response.begin() [ 1972.834307] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1972.834307] env[62277]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1972.834307] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1972.834307] env[62277]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1972.834307] env[62277]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1972.834307] env[62277]: ERROR oslo_vmware.rw_handles [ 1972.835159] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Downloaded image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to vmware_temp/085cd5ff-4f9d-4bd9-9b5c-5c2048e0c462/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1972.837825] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Caching image {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1972.838092] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Copying Virtual Disk [datastore2] vmware_temp/085cd5ff-4f9d-4bd9-9b5c-5c2048e0c462/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk to [datastore2] vmware_temp/085cd5ff-4f9d-4bd9-9b5c-5c2048e0c462/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk {{(pid=62277) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1972.838393] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-9ef7c798-7ca0-4fac-99dd-9c9f031467a4 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1972.847965] env[62277]: DEBUG oslo_vmware.api [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Waiting for the task: (returnval){ [ 
1972.847965] env[62277]: value = "task-1405471" [ 1972.847965] env[62277]: _type = "Task" [ 1972.847965] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1972.855974] env[62277]: DEBUG oslo_vmware.api [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Task: {'id': task-1405471, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1973.357904] env[62277]: DEBUG oslo_vmware.exceptions [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Fault InvalidArgument not matched. {{(pid=62277) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1973.358791] env[62277]: DEBUG oslo_concurrency.lockutils [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1973.358960] env[62277]: ERROR nova.compute.manager [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1973.358960] env[62277]: Faults: ['InvalidArgument'] [ 1973.358960] env[62277]: ERROR nova.compute.manager [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Traceback (most recent call last): [ 1973.358960] env[62277]: ERROR nova.compute.manager [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1973.358960] env[62277]: ERROR nova.compute.manager [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] yield resources [ 1973.358960] env[62277]: ERROR nova.compute.manager [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1973.358960] env[62277]: ERROR nova.compute.manager [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] self.driver.spawn(context, instance, image_meta, [ 1973.358960] env[62277]: ERROR nova.compute.manager [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1973.358960] env[62277]: ERROR nova.compute.manager [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1973.358960] env[62277]: ERROR nova.compute.manager [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1973.358960] env[62277]: ERROR nova.compute.manager [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] self._fetch_image_if_missing(context, vi) [ 1973.358960] env[62277]: ERROR nova.compute.manager [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1973.358960] env[62277]: ERROR nova.compute.manager [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] 
image_cache(vi, tmp_image_ds_loc) [ 1973.358960] env[62277]: ERROR nova.compute.manager [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1973.358960] env[62277]: ERROR nova.compute.manager [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] vm_util.copy_virtual_disk( [ 1973.358960] env[62277]: ERROR nova.compute.manager [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1973.358960] env[62277]: ERROR nova.compute.manager [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] session._wait_for_task(vmdk_copy_task) [ 1973.358960] env[62277]: ERROR nova.compute.manager [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1973.358960] env[62277]: ERROR nova.compute.manager [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] return self.wait_for_task(task_ref) [ 1973.358960] env[62277]: ERROR nova.compute.manager [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1973.358960] env[62277]: ERROR nova.compute.manager [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] return evt.wait() [ 1973.358960] env[62277]: ERROR nova.compute.manager [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1973.358960] env[62277]: ERROR nova.compute.manager [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] result = hub.switch() [ 1973.358960] env[62277]: ERROR nova.compute.manager [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1973.358960] env[62277]: ERROR nova.compute.manager [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] return self.greenlet.switch() [ 1973.358960] env[62277]: ERROR nova.compute.manager [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1973.358960] env[62277]: ERROR nova.compute.manager [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] self.f(*self.args, **self.kw) [ 1973.358960] env[62277]: ERROR nova.compute.manager [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1973.358960] env[62277]: ERROR nova.compute.manager [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] raise exceptions.translate_fault(task_info.error) [ 1973.358960] env[62277]: ERROR nova.compute.manager [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1973.358960] env[62277]: ERROR nova.compute.manager [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Faults: ['InvalidArgument'] [ 1973.358960] env[62277]: ERROR nova.compute.manager [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] [ 1973.359906] env[62277]: INFO nova.compute.manager [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Terminating instance [ 1973.360870] env[62277]: DEBUG oslo_concurrency.lockutils [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 
tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1973.361086] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1973.361334] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-df5dd3f6-d296-4cb5-b6a7-91a5ad680d11 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1973.363509] env[62277]: DEBUG nova.compute.manager [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Start destroying the instance on the hypervisor. {{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1973.363703] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1973.364436] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78e71a62-2f5b-4c26-90dd-e3e3a9ba68af {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1973.371315] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Unregistering the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1973.371527] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-37eae367-bea4-4fb8-b0ea-160ecdd0e588 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1973.373622] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1973.373792] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=62277) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1973.374718] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-bdc05f3f-ce43-4610-83ae-eb1ddb73a54b {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1973.379310] env[62277]: DEBUG oslo_vmware.api [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Waiting for the task: (returnval){ [ 1973.379310] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]529400ea-30e8-19d3-d3f5-33532ab0d715" [ 1973.379310] env[62277]: _type = "Task" [ 1973.379310] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1973.386302] env[62277]: DEBUG oslo_vmware.api [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]529400ea-30e8-19d3-d3f5-33532ab0d715, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1973.435325] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Unregistered the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1973.435667] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Deleting contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1973.435887] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Deleting the datastore file [datastore2] 63267d5c-d004-41c1-866a-75b9e37521b7 {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1973.436172] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-7ddfcb5b-8b24-4140-b56d-c693228a095c {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1973.442631] env[62277]: DEBUG oslo_vmware.api [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Waiting for the task: (returnval){ [ 1973.442631] env[62277]: value = "task-1405473" [ 1973.442631] env[62277]: _type = "Task" [ 1973.442631] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1973.450620] env[62277]: DEBUG oslo_vmware.api [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Task: {'id': task-1405473, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1973.891316] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] [instance: 13959890-87a1-45ba-98de-621373e265e7] Preparing fetch location {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1973.891316] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Creating directory with path [datastore2] vmware_temp/3c20ceea-36cc-48ef-87bc-31324d949126/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1973.891316] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a6602d9d-d777-4ed8-a5fa-0bd1f4d2dff0 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1973.904466] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Created directory with path [datastore2] vmware_temp/3c20ceea-36cc-48ef-87bc-31324d949126/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1973.904675] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] [instance: 13959890-87a1-45ba-98de-621373e265e7] Fetch image to [datastore2] vmware_temp/3c20ceea-36cc-48ef-87bc-31324d949126/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1973.904949] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] [instance: 13959890-87a1-45ba-98de-621373e265e7] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to [datastore2] vmware_temp/3c20ceea-36cc-48ef-87bc-31324d949126/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1973.905639] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b56f9a3b-c5e3-4e98-bfe0-1112e076b8b4 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1973.912415] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-afa476bc-a133-4789-9edc-733ebe1929bc {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1973.921462] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-807706e1-685e-44d1-9990-f201c750c170 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1973.954688] env[62277]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d172ab12-e1f5-4986-9501-54be338db4d0 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1973.961587] env[62277]: DEBUG oslo_vmware.api [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Task: {'id': task-1405473, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.076673} completed successfully. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1973.963010] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Deleted the datastore file {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1973.963209] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Deleted contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1973.963373] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1973.963541] env[62277]: INFO nova.compute.manager [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Took 0.60 seconds to destroy the instance on the hypervisor. 
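
The CopyVirtualDisk_Task and DeleteDatastoreFile_Task entries above follow the same pattern: vCenter returns a task object that is then polled until it reports success ("completed successfully") or an error that gets translated into a fault such as InvalidArgument. Below is a minimal, self-contained sketch of that polling loop; `get_task_info`, the state strings, and the poll interval are illustrative stand-ins, not the actual oslo.vmware API.

    import time

    class TaskFailed(Exception):
        """Stand-in for a translated VIM fault (e.g. InvalidArgument)."""

    def wait_for_task(get_task_info, poll_interval=0.5):
        # Poll the task until it reports success; translate an error state
        # into an exception, mirroring the progress/fault entries above.
        while True:
            info = get_task_info()  # hypothetical helper returning a dict
            if info["state"] == "success":
                return info.get("result")
            if info["state"] == "error":
                raise TaskFailed(info.get("error", "task failed"))
            print("progress is %s%%" % info.get("progress", 0))
            time.sleep(poll_interval)
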
[ 1973.965267] env[62277]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-5502a308-9ab1-4ba7-bb07-c064cfcbc255 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1973.967069] env[62277]: DEBUG nova.compute.claims [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Aborting claim: {{(pid=62277) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1973.967243] env[62277]: DEBUG oslo_concurrency.lockutils [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1973.967450] env[62277]: DEBUG oslo_concurrency.lockutils [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1973.988975] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] [instance: 13959890-87a1-45ba-98de-621373e265e7] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1974.041152] env[62277]: DEBUG oslo_vmware.rw_handles [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/3c20ceea-36cc-48ef-87bc-31324d949126/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62277) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1974.102414] env[62277]: DEBUG oslo_vmware.rw_handles [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Completed reading data from the image iterator. {{(pid=62277) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1974.102637] env[62277]: DEBUG oslo_vmware.rw_handles [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/3c20ceea-36cc-48ef-87bc-31324d949126/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=62277) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1974.269164] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ef42c79-484b-474f-9e5e-8b821ffc0b0b {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1974.277011] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-59a6923e-04e2-423b-889b-a1dd2a90fcb7 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1974.307940] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-80bcb7bc-af0a-461c-9b8e-76806b2bdbbe {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1974.315253] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42ddf02c-4ace-4b0b-a84e-6312ea159826 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1974.329856] env[62277]: DEBUG nova.compute.provider_tree [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1974.337271] env[62277]: DEBUG nova.scheduler.client.report [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1974.351433] env[62277]: DEBUG oslo_concurrency.lockutils [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.384s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1974.351983] env[62277]: ERROR nova.compute.manager [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1974.351983] env[62277]: Faults: ['InvalidArgument'] [ 1974.351983] env[62277]: ERROR nova.compute.manager [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Traceback (most recent call last): [ 1974.351983] env[62277]: ERROR nova.compute.manager [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1974.351983] env[62277]: ERROR 
nova.compute.manager [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] self.driver.spawn(context, instance, image_meta, [ 1974.351983] env[62277]: ERROR nova.compute.manager [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1974.351983] env[62277]: ERROR nova.compute.manager [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1974.351983] env[62277]: ERROR nova.compute.manager [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1974.351983] env[62277]: ERROR nova.compute.manager [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] self._fetch_image_if_missing(context, vi) [ 1974.351983] env[62277]: ERROR nova.compute.manager [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1974.351983] env[62277]: ERROR nova.compute.manager [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] image_cache(vi, tmp_image_ds_loc) [ 1974.351983] env[62277]: ERROR nova.compute.manager [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1974.351983] env[62277]: ERROR nova.compute.manager [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] vm_util.copy_virtual_disk( [ 1974.351983] env[62277]: ERROR nova.compute.manager [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1974.351983] env[62277]: ERROR nova.compute.manager [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] session._wait_for_task(vmdk_copy_task) [ 1974.351983] env[62277]: ERROR nova.compute.manager [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1974.351983] env[62277]: ERROR nova.compute.manager [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] return self.wait_for_task(task_ref) [ 1974.351983] env[62277]: ERROR nova.compute.manager [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1974.351983] env[62277]: ERROR nova.compute.manager [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] return evt.wait() [ 1974.351983] env[62277]: ERROR nova.compute.manager [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1974.351983] env[62277]: ERROR nova.compute.manager [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] result = hub.switch() [ 1974.351983] env[62277]: ERROR nova.compute.manager [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1974.351983] env[62277]: ERROR nova.compute.manager [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] return self.greenlet.switch() [ 1974.351983] env[62277]: ERROR nova.compute.manager [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1974.351983] env[62277]: ERROR nova.compute.manager [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] self.f(*self.args, **self.kw) [ 1974.351983] env[62277]: ERROR nova.compute.manager [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1974.351983] env[62277]: ERROR nova.compute.manager [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] raise exceptions.translate_fault(task_info.error) [ 1974.351983] env[62277]: ERROR nova.compute.manager [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1974.351983] env[62277]: ERROR nova.compute.manager [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Faults: ['InvalidArgument'] [ 1974.351983] env[62277]: ERROR nova.compute.manager [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] [ 1974.352883] env[62277]: DEBUG nova.compute.utils [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] VimFaultException {{(pid=62277) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1974.354129] env[62277]: DEBUG nova.compute.manager [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Build of instance 63267d5c-d004-41c1-866a-75b9e37521b7 was re-scheduled: A specified parameter was not correct: fileType [ 1974.354129] env[62277]: Faults: ['InvalidArgument'] {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1974.354492] env[62277]: DEBUG nova.compute.manager [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Unplugging VIFs for instance {{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1974.354676] env[62277]: DEBUG nova.compute.manager [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1974.354846] env[62277]: DEBUG nova.compute.manager [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1974.355012] env[62277]: DEBUG nova.network.neutron [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1974.759727] env[62277]: DEBUG nova.network.neutron [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1974.774641] env[62277]: INFO nova.compute.manager [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Took 0.42 seconds to deallocate network for instance. [ 1974.874071] env[62277]: INFO nova.scheduler.client.report [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Deleted allocations for instance 63267d5c-d004-41c1-866a-75b9e37521b7 [ 1974.897821] env[62277]: DEBUG oslo_concurrency.lockutils [None req-6c5d9193-e753-43be-b6c5-051c0d2bbd67 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Lock "63267d5c-d004-41c1-866a-75b9e37521b7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 637.828s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1974.900077] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b859558a-63e9-4c03-88ff-71a69ecfbcb8 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Lock "63267d5c-d004-41c1-866a-75b9e37521b7" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 442.301s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1974.900077] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b859558a-63e9-4c03-88ff-71a69ecfbcb8 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Acquiring lock "63267d5c-d004-41c1-866a-75b9e37521b7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1974.900077] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b859558a-63e9-4c03-88ff-71a69ecfbcb8 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Lock "63267d5c-d004-41c1-866a-75b9e37521b7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=62277) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1974.900077] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b859558a-63e9-4c03-88ff-71a69ecfbcb8 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Lock "63267d5c-d004-41c1-866a-75b9e37521b7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1974.902033] env[62277]: INFO nova.compute.manager [None req-b859558a-63e9-4c03-88ff-71a69ecfbcb8 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Terminating instance [ 1974.904025] env[62277]: DEBUG nova.compute.manager [None req-b859558a-63e9-4c03-88ff-71a69ecfbcb8 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Start destroying the instance on the hypervisor. {{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1974.904025] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-b859558a-63e9-4c03-88ff-71a69ecfbcb8 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1974.904558] env[62277]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-f70f3432-dc96-4ab9-9ecd-b253aadad30f {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1974.915251] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a6cd1089-37db-4aed-8fec-301aa94a25bb {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1974.926238] env[62277]: DEBUG nova.compute.manager [None req-986881fe-22f5-4969-b560-0c069535f231 tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] [instance: 9bf5cf30-5142-4d51-b5d9-1bbfb9eedce8] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1974.946517] env[62277]: WARNING nova.virt.vmwareapi.vmops [None req-b859558a-63e9-4c03-88ff-71a69ecfbcb8 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 63267d5c-d004-41c1-866a-75b9e37521b7 could not be found. [ 1974.946639] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-b859558a-63e9-4c03-88ff-71a69ecfbcb8 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1974.946785] env[62277]: INFO nova.compute.manager [None req-b859558a-63e9-4c03-88ff-71a69ecfbcb8 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Took 0.04 seconds to destroy the instance on the hypervisor. 
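
The lockutils entries in this section ("acquired ... waited 442.301s", "released ... held 0.187s") reflect named, per-resource locks whose wait and hold times are logged. The snippet below is a simplified re-creation of that bookkeeping using only the standard library; production code uses oslo.concurrency, and the names here are illustrative only.

    import contextlib
    import threading
    import time

    _locks = {}
    _registry_guard = threading.Lock()

    @contextlib.contextmanager
    def named_lock(name):
        # Look up (or create) the lock for this name, then record how long
        # we waited to acquire it and how long we held it.
        with _registry_guard:
            lock = _locks.setdefault(name, threading.Lock())
        start = time.monotonic()
        lock.acquire()
        acquired = time.monotonic()
        print('Lock "%s" acquired :: waited %.3fs' % (name, acquired - start))
        try:
            yield
        finally:
            lock.release()
            print('Lock "%s" released :: held %.3fs'
                  % (name, time.monotonic() - acquired))

    # e.g. with named_lock("compute_resources"): ... claim or abort resources ...
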
[ 1974.947045] env[62277]: DEBUG oslo.service.loopingcall [None req-b859558a-63e9-4c03-88ff-71a69ecfbcb8 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1974.947292] env[62277]: DEBUG nova.compute.manager [-] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1974.947380] env[62277]: DEBUG nova.network.neutron [-] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1974.954297] env[62277]: DEBUG nova.compute.manager [None req-986881fe-22f5-4969-b560-0c069535f231 tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] [instance: 9bf5cf30-5142-4d51-b5d9-1bbfb9eedce8] Instance disappeared before build. {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1974.977845] env[62277]: DEBUG oslo_concurrency.lockutils [None req-986881fe-22f5-4969-b560-0c069535f231 tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Lock "9bf5cf30-5142-4d51-b5d9-1bbfb9eedce8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 242.740s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1974.986146] env[62277]: DEBUG nova.network.neutron [-] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1974.994950] env[62277]: DEBUG nova.compute.manager [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1974.998739] env[62277]: INFO nova.compute.manager [-] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] Took 0.05 seconds to deallocate network for instance. 
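
Network teardown above is driven through a looping call that waits for `_deallocate_network_with_retries` to return, i.e. the deallocation is wrapped so transient failures can be retried rather than failing the delete outright. A rough sketch of such a retry wrapper, with hypothetical names, attempt counts, and delays, could look like this:

    import time

    def call_with_retries(func, attempts=3, delay=1.0):
        # Keep invoking func until it succeeds or attempts are exhausted;
        # the attempt count and delay are illustrative values only.
        for attempt in range(1, attempts + 1):
            try:
                return func()
            except Exception as exc:  # broad catch is acceptable in a sketch
                if attempt == attempts:
                    raise
                print("attempt %d failed (%s); retrying in %.1fs"
                      % (attempt, exc, delay))
                time.sleep(delay)

    # call_with_retries(lambda: deallocate_network(instance)) would mirror the
    # "_deallocate_network_with_retries" pattern named in the log.
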
[ 1975.046826] env[62277]: DEBUG oslo_concurrency.lockutils [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1975.046826] env[62277]: DEBUG oslo_concurrency.lockutils [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1975.047532] env[62277]: INFO nova.compute.claims [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1975.085686] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b859558a-63e9-4c03-88ff-71a69ecfbcb8 tempest-ServerActionsTestJSON-466026132 tempest-ServerActionsTestJSON-466026132-project-member] Lock "63267d5c-d004-41c1-866a-75b9e37521b7" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.187s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1975.087024] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "63267d5c-d004-41c1-866a-75b9e37521b7" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 104.961s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1975.087024] env[62277]: INFO nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 63267d5c-d004-41c1-866a-75b9e37521b7] During sync_power_state the instance has a pending task (deleting). Skip. 
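
The report-client entries around this point repeat the provider inventory for 75e125ea-a599-4b65-b9cd-6ea881735292 (VCPU, MEMORY_MB, DISK_GB with their totals, reservations and allocation ratios). Under the usual Placement convention, usable capacity per resource class works out to (total - reserved) * allocation_ratio; the short calculation below simply replays the figures from the log.

    # Inventory as reported in the surrounding entries for this provider.
    inventory = {
        "VCPU": {"total": 48, "reserved": 0, "allocation_ratio": 4.0},
        "MEMORY_MB": {"total": 196590, "reserved": 512, "allocation_ratio": 1.0},
        "DISK_GB": {"total": 400, "reserved": 0, "allocation_ratio": 1.0},
    }

    for rc, inv in inventory.items():
        capacity = int((inv["total"] - inv["reserved"]) * inv["allocation_ratio"])
        print(rc, capacity)  # VCPU 192, MEMORY_MB 196078, DISK_GB 400
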
[ 1975.087164] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "63267d5c-d004-41c1-866a-75b9e37521b7" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1975.273456] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-645b0a01-bf00-40f7-ad88-b11b049701d6 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1975.280930] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9502430a-945e-46b5-8026-f3f3ed88ac12 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1975.310133] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78b81c62-6011-4a2f-b728-1c8a1f28ca7e {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1975.317178] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d94160a0-182f-46cc-b666-d46ee4ce8e60 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1975.329777] env[62277]: DEBUG nova.compute.provider_tree [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1975.339834] env[62277]: DEBUG nova.scheduler.client.report [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1975.352266] env[62277]: DEBUG oslo_concurrency.lockutils [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.306s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1975.352707] env[62277]: DEBUG nova.compute.manager [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Start building networks asynchronously for instance. 
{{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1975.385393] env[62277]: DEBUG nova.compute.utils [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Using /dev/sd instead of None {{(pid=62277) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1975.386705] env[62277]: DEBUG nova.compute.manager [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Allocating IP information in the background. {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1975.386955] env[62277]: DEBUG nova.network.neutron [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] allocate_for_instance() {{(pid=62277) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1975.395203] env[62277]: DEBUG nova.compute.manager [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Start building block device mappings for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1975.448142] env[62277]: DEBUG nova.policy [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '696edb47b3844d7499217e84fcf42619', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e7e15898bc784416bdc7fa9a9423726f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62277) authorize /opt/stack/nova/nova/policy.py:203}} [ 1975.469056] env[62277]: DEBUG nova.compute.manager [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Start spawning the instance on the hypervisor. 
{{(pid=62277) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1975.494845] env[62277]: DEBUG nova.virt.hardware [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-11-06T22:11:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-11-06T22:11:15Z,direct_url=,disk_format='vmdk',id=6f125163-af69-40e9-92ae-3b8a01d74b60,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='63ed6630e9c140baa826f53d7a0564d1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-11-06T22:11:16Z,virtual_size=,visibility=), allow threads: False {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1975.495111] env[62277]: DEBUG nova.virt.hardware [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Flavor limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1975.495269] env[62277]: DEBUG nova.virt.hardware [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Image limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1975.495724] env[62277]: DEBUG nova.virt.hardware [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Flavor pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1975.495724] env[62277]: DEBUG nova.virt.hardware [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Image pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1975.495724] env[62277]: DEBUG nova.virt.hardware [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1975.495903] env[62277]: DEBUG nova.virt.hardware [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1975.496070] env[62277]: DEBUG nova.virt.hardware [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1975.496230] env[62277]: DEBUG nova.virt.hardware [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 
tempest-ServersTestJSON-1136389312-project-member] Got 1 possible topologies {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1975.496386] env[62277]: DEBUG nova.virt.hardware [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1975.496551] env[62277]: DEBUG nova.virt.hardware [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1975.497438] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-47846af8-9c49-4686-94d6-0029e78af8e5 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1975.506852] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-10eb7afa-28b9-49b2-9afe-9a83cb130dd4 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1975.783894] env[62277]: DEBUG nova.network.neutron [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Successfully created port: 152fd1b4-70e1-45c5-968a-b6938ee4b6d4 {{(pid=62277) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1976.838373] env[62277]: DEBUG nova.compute.manager [req-24a9f473-8739-44d5-8b9a-9addc4c2e434 req-f9ed8347-de60-4901-b6ef-e61e7c52a263 service nova] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Received event network-vif-plugged-152fd1b4-70e1-45c5-968a-b6938ee4b6d4 {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 1976.838600] env[62277]: DEBUG oslo_concurrency.lockutils [req-24a9f473-8739-44d5-8b9a-9addc4c2e434 req-f9ed8347-de60-4901-b6ef-e61e7c52a263 service nova] Acquiring lock "c4c22c8a-4300-45ce-8484-77c638f7bbc5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1976.838799] env[62277]: DEBUG oslo_concurrency.lockutils [req-24a9f473-8739-44d5-8b9a-9addc4c2e434 req-f9ed8347-de60-4901-b6ef-e61e7c52a263 service nova] Lock "c4c22c8a-4300-45ce-8484-77c638f7bbc5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1976.838964] env[62277]: DEBUG oslo_concurrency.lockutils [req-24a9f473-8739-44d5-8b9a-9addc4c2e434 req-f9ed8347-de60-4901-b6ef-e61e7c52a263 service nova] Lock "c4c22c8a-4300-45ce-8484-77c638f7bbc5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1976.839333] env[62277]: DEBUG nova.compute.manager [req-24a9f473-8739-44d5-8b9a-9addc4c2e434 req-f9ed8347-de60-4901-b6ef-e61e7c52a263 service nova] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] 
No waiting events found dispatching network-vif-plugged-152fd1b4-70e1-45c5-968a-b6938ee4b6d4 {{(pid=62277) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1976.839572] env[62277]: WARNING nova.compute.manager [req-24a9f473-8739-44d5-8b9a-9addc4c2e434 req-f9ed8347-de60-4901-b6ef-e61e7c52a263 service nova] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Received unexpected event network-vif-plugged-152fd1b4-70e1-45c5-968a-b6938ee4b6d4 for instance with vm_state building and task_state spawning. [ 1976.846697] env[62277]: DEBUG nova.network.neutron [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Successfully updated port: 152fd1b4-70e1-45c5-968a-b6938ee4b6d4 {{(pid=62277) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1976.860529] env[62277]: DEBUG oslo_concurrency.lockutils [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Acquiring lock "refresh_cache-c4c22c8a-4300-45ce-8484-77c638f7bbc5" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1976.860529] env[62277]: DEBUG oslo_concurrency.lockutils [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Acquired lock "refresh_cache-c4c22c8a-4300-45ce-8484-77c638f7bbc5" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1976.860529] env[62277]: DEBUG nova.network.neutron [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Building network info cache for instance {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1976.898465] env[62277]: DEBUG nova.network.neutron [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Instance cache missing network info. 
{{(pid=62277) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1977.131828] env[62277]: DEBUG nova.network.neutron [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Updating instance_info_cache with network_info: [{"id": "152fd1b4-70e1-45c5-968a-b6938ee4b6d4", "address": "fa:16:3e:3c:12:f1", "network": {"id": "7efa6c69-4ed6-4615-b77a-53d6e045efc5", "bridge": "br-int", "label": "tempest-ServersTestJSON-132086172-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e7e15898bc784416bdc7fa9a9423726f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6b8137fc-f23d-49b1-b19c-3123a5588f34", "external-id": "nsx-vlan-transportzone-709", "segmentation_id": 709, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap152fd1b4-70", "ovs_interfaceid": "152fd1b4-70e1-45c5-968a-b6938ee4b6d4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1977.146131] env[62277]: DEBUG oslo_concurrency.lockutils [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Releasing lock "refresh_cache-c4c22c8a-4300-45ce-8484-77c638f7bbc5" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1977.146438] env[62277]: DEBUG nova.compute.manager [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Instance network_info: |[{"id": "152fd1b4-70e1-45c5-968a-b6938ee4b6d4", "address": "fa:16:3e:3c:12:f1", "network": {"id": "7efa6c69-4ed6-4615-b77a-53d6e045efc5", "bridge": "br-int", "label": "tempest-ServersTestJSON-132086172-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e7e15898bc784416bdc7fa9a9423726f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6b8137fc-f23d-49b1-b19c-3123a5588f34", "external-id": "nsx-vlan-transportzone-709", "segmentation_id": 709, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap152fd1b4-70", "ovs_interfaceid": "152fd1b4-70e1-45c5-968a-b6938ee4b6d4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1977.146829] env[62277]: 
DEBUG nova.virt.vmwareapi.vmops [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:3c:12:f1', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '6b8137fc-f23d-49b1-b19c-3123a5588f34', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '152fd1b4-70e1-45c5-968a-b6938ee4b6d4', 'vif_model': 'vmxnet3'}] {{(pid=62277) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1977.154367] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Creating folder: Project (e7e15898bc784416bdc7fa9a9423726f). Parent ref: group-v297781. {{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1977.154846] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9ddc39f5-9392-4236-a472-81fa73f8e079 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1977.166937] env[62277]: INFO nova.virt.vmwareapi.vm_util [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Created folder: Project (e7e15898bc784416bdc7fa9a9423726f) in parent group-v297781. [ 1977.167131] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Creating folder: Instances. Parent ref: group-v297880. {{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1977.167347] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-8d472629-f644-43dc-a306-2b6374e5ccd6 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1977.175813] env[62277]: INFO nova.virt.vmwareapi.vm_util [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Created folder: Instances in parent group-v297880. [ 1977.176040] env[62277]: DEBUG oslo.service.loopingcall [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1977.176224] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Creating VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1977.176413] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-26b1cf9c-f940-4758-b677-4bbb8449e91a {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1977.195957] env[62277]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1977.195957] env[62277]: value = "task-1405476" [ 1977.195957] env[62277]: _type = "Task" [ 1977.195957] env[62277]: } to complete. 
{{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1977.203412] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405476, 'name': CreateVM_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1977.705620] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405476, 'name': CreateVM_Task, 'duration_secs': 0.277177} completed successfully. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1977.705790] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Created VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1977.706467] env[62277]: DEBUG oslo_concurrency.lockutils [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1977.706630] env[62277]: DEBUG oslo_concurrency.lockutils [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1977.706943] env[62277]: DEBUG oslo_concurrency.lockutils [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1977.707203] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1a1d75d8-d516-4b11-9f64-c287d4ed933d {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1977.711582] env[62277]: DEBUG oslo_vmware.api [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Waiting for the task: (returnval){ [ 1977.711582] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]5298a96d-0a5b-7f7e-d4ee-2c137017be67" [ 1977.711582] env[62277]: _type = "Task" [ 1977.711582] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1977.718622] env[62277]: DEBUG oslo_vmware.api [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]5298a96d-0a5b-7f7e-d4ee-2c137017be67, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1978.224963] env[62277]: DEBUG oslo_concurrency.lockutils [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1978.225355] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Processing image 6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1978.225459] env[62277]: DEBUG oslo_concurrency.lockutils [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1978.864461] env[62277]: DEBUG nova.compute.manager [req-a5fece4d-4753-4cb9-93c2-4810637af172 req-d31c661e-4d54-4f24-a6fa-8a49ddfb4d24 service nova] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Received event network-changed-152fd1b4-70e1-45c5-968a-b6938ee4b6d4 {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 1978.864622] env[62277]: DEBUG nova.compute.manager [req-a5fece4d-4753-4cb9-93c2-4810637af172 req-d31c661e-4d54-4f24-a6fa-8a49ddfb4d24 service nova] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Refreshing instance network info cache due to event network-changed-152fd1b4-70e1-45c5-968a-b6938ee4b6d4. {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11104}} [ 1978.864812] env[62277]: DEBUG oslo_concurrency.lockutils [req-a5fece4d-4753-4cb9-93c2-4810637af172 req-d31c661e-4d54-4f24-a6fa-8a49ddfb4d24 service nova] Acquiring lock "refresh_cache-c4c22c8a-4300-45ce-8484-77c638f7bbc5" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1978.864952] env[62277]: DEBUG oslo_concurrency.lockutils [req-a5fece4d-4753-4cb9-93c2-4810637af172 req-d31c661e-4d54-4f24-a6fa-8a49ddfb4d24 service nova] Acquired lock "refresh_cache-c4c22c8a-4300-45ce-8484-77c638f7bbc5" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1978.865153] env[62277]: DEBUG nova.network.neutron [req-a5fece4d-4753-4cb9-93c2-4810637af172 req-d31c661e-4d54-4f24-a6fa-8a49ddfb4d24 service nova] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Refreshing network info cache for port 152fd1b4-70e1-45c5-968a-b6938ee4b6d4 {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1979.360391] env[62277]: DEBUG nova.network.neutron [req-a5fece4d-4753-4cb9-93c2-4810637af172 req-d31c661e-4d54-4f24-a6fa-8a49ddfb4d24 service nova] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Updated VIF entry in instance network info cache for port 152fd1b4-70e1-45c5-968a-b6938ee4b6d4. 
{{(pid=62277) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1979.360783] env[62277]: DEBUG nova.network.neutron [req-a5fece4d-4753-4cb9-93c2-4810637af172 req-d31c661e-4d54-4f24-a6fa-8a49ddfb4d24 service nova] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Updating instance_info_cache with network_info: [{"id": "152fd1b4-70e1-45c5-968a-b6938ee4b6d4", "address": "fa:16:3e:3c:12:f1", "network": {"id": "7efa6c69-4ed6-4615-b77a-53d6e045efc5", "bridge": "br-int", "label": "tempest-ServersTestJSON-132086172-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e7e15898bc784416bdc7fa9a9423726f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6b8137fc-f23d-49b1-b19c-3123a5588f34", "external-id": "nsx-vlan-transportzone-709", "segmentation_id": 709, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap152fd1b4-70", "ovs_interfaceid": "152fd1b4-70e1-45c5-968a-b6938ee4b6d4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1979.371523] env[62277]: DEBUG oslo_concurrency.lockutils [req-a5fece4d-4753-4cb9-93c2-4810637af172 req-d31c661e-4d54-4f24-a6fa-8a49ddfb4d24 service nova] Releasing lock "refresh_cache-c4c22c8a-4300-45ce-8484-77c638f7bbc5" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1980.503899] env[62277]: DEBUG oslo_concurrency.lockutils [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Acquiring lock "297d53df-7918-4389-9c63-a600755da969" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1980.504208] env[62277]: DEBUG oslo_concurrency.lockutils [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Lock "297d53df-7918-4389-9c63-a600755da969" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2004.191307] env[62277]: DEBUG oslo_concurrency.lockutils [None req-231d0e7a-82be-4285-b16a-80974a83b196 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Acquiring lock "c4c22c8a-4300-45ce-8484-77c638f7bbc5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2007.169758] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62277) run_periodic_tasks
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2009.169316] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2010.165050] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2010.168502] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2010.168689] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager.update_available_resource {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2010.180046] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2010.180363] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2010.180428] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2010.180550] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62277) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2010.181682] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-791020c9-89b0-4be3-b973-17fef26ff84c {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2010.190092] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ee263fa8-d680-46d1-81ac-1356b878513d {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2010.203794] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4d663adb-5b3a-4c12-979d-9b82a538b972 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2010.210046] env[62277]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e7d68e99-9e51-40a8-b4d6-9ad31752f190 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2010.239073] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181428MB free_disk=184GB free_vcpus=48 pci_devices=None {{(pid=62277) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2010.239256] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2010.239413] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2010.316570] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 13959890-87a1-45ba-98de-621373e265e7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2010.316731] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance a7cc7e45-8567-4699-af83-624b1c7c5c64 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2010.316861] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 42005809-1926-44b2-8ef6-3b6cb28a4020 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2010.316989] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2010.317125] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2010.317245] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 163eb4e7-33f8-4674-8a3f-5094356e250d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2010.317371] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 400beb27-a709-4ef4-851e-5caaab9ca60b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2010.317489] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 9f7d4431-d5ea-4f9b-888b-77a6a7772047 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2010.317601] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 8b9ef530-e79f-4cd4-8a88-83871ed65f90 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2010.317713] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance c4c22c8a-4300-45ce-8484-77c638f7bbc5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2010.328199] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 97a1dad9-a665-42ca-b85e-5fef59ab80bf has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 2010.338114] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 7748e3a1-6adc-4482-90f9-a3816a224272 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 2010.348159] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 71f172bf-94bd-4027-bbae-f3bd3e1f91c3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 2010.358196] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance ad83bf06-d712-4bd4-8086-9c3b615adaf5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 2010.366862] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance e0c4112c-6bc6-44e4-8e43-fda8203bf1c3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 2010.375469] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 81c79e22-aaa3-45dc-967a-b4a884f692eb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 2010.384237] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance de543a46-26c3-40b3-9ccd-80bb1f9845d7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 2010.392619] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 297d53df-7918-4389-9c63-a600755da969 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 2010.392836] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2010.393016] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2010.581312] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e50e2f82-5fd8-4121-8232-f8841f9f770c {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2010.589085] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9bf2cf35-4ad6-4661-953f-b1ec82905057 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2010.619292] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0eab6ded-c2df-4932-85ca-6703d5765e50 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2010.626499] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7dd25429-a19e-4715-888b-020c018ec18c {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2010.638964] env[62277]: DEBUG nova.compute.provider_tree [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2010.646728] env[62277]: DEBUG nova.scheduler.client.report [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2010.659658] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62277) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2010.659882] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.420s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2011.660349] env[62277]: DEBUG oslo_service.periodic_task [None 
req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2011.660614] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Starting heal instance info cache {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9909}} [ 2011.660653] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Rebuilding the list of instances to heal {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9913}} [ 2011.681991] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 13959890-87a1-45ba-98de-621373e265e7] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2011.682224] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2011.682364] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2011.682488] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2011.682608] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2011.682727] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2011.682844] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2011.682962] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2011.683091] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Skipping network cache update for instance because it is Building. 
{{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2011.683208] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2011.683326] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Didn't find any instances for network info cache update. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9995}} [ 2013.169464] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2013.169464] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2014.052049] env[62277]: DEBUG oslo_concurrency.lockutils [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Acquiring lock "940561d5-723b-4e43-8fab-35e8af95ce09" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2014.052049] env[62277]: DEBUG oslo_concurrency.lockutils [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Lock "940561d5-723b-4e43-8fab-35e8af95ce09" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2017.169068] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2017.169453] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] CONF.reclaim_instance_interval <= 0, skipping...
{{(pid=62277) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10528}} [ 2022.854059] env[62277]: WARNING oslo_vmware.rw_handles [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2022.854059] env[62277]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2022.854059] env[62277]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2022.854059] env[62277]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2022.854059] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2022.854059] env[62277]: ERROR oslo_vmware.rw_handles response.begin() [ 2022.854059] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2022.854059] env[62277]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2022.854059] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2022.854059] env[62277]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2022.854059] env[62277]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2022.854059] env[62277]: ERROR oslo_vmware.rw_handles [ 2022.854675] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] [instance: 13959890-87a1-45ba-98de-621373e265e7] Downloaded image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to vmware_temp/3c20ceea-36cc-48ef-87bc-31324d949126/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2022.856670] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] [instance: 13959890-87a1-45ba-98de-621373e265e7] Caching image {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2022.856967] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Copying Virtual Disk [datastore2] vmware_temp/3c20ceea-36cc-48ef-87bc-31324d949126/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk to [datastore2] vmware_temp/3c20ceea-36cc-48ef-87bc-31324d949126/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk {{(pid=62277) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2022.857329] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-b21c058c-b633-4bdb-b0ec-e844f328a529 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2022.865015] env[62277]: DEBUG oslo_vmware.api [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 
tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Waiting for the task: (returnval){ [ 2022.865015] env[62277]: value = "task-1405477" [ 2022.865015] env[62277]: _type = "Task" [ 2022.865015] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2022.872910] env[62277]: DEBUG oslo_vmware.api [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Task: {'id': task-1405477, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2023.376192] env[62277]: DEBUG oslo_vmware.exceptions [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Fault InvalidArgument not matched. {{(pid=62277) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2023.377025] env[62277]: DEBUG oslo_concurrency.lockutils [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2023.377025] env[62277]: ERROR nova.compute.manager [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] [instance: 13959890-87a1-45ba-98de-621373e265e7] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2023.377025] env[62277]: Faults: ['InvalidArgument'] [ 2023.377025] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] Traceback (most recent call last): [ 2023.377025] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2023.377025] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] yield resources [ 2023.377025] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2023.377025] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] self.driver.spawn(context, instance, image_meta, [ 2023.377025] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2023.377025] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2023.377025] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2023.377025] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] self._fetch_image_if_missing(context, vi) [ 2023.377025] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2023.377025] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] image_cache(vi, tmp_image_ds_loc) [ 2023.377025] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2023.377025] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] vm_util.copy_virtual_disk( [ 2023.377025] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2023.377025] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] session._wait_for_task(vmdk_copy_task) [ 2023.377025] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2023.377025] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] return self.wait_for_task(task_ref) [ 2023.377025] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2023.377025] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] return evt.wait() [ 2023.377025] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2023.377025] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] result = hub.switch() [ 2023.377025] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2023.377025] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] return self.greenlet.switch() [ 2023.377025] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2023.377025] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] self.f(*self.args, **self.kw) [ 2023.377025] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2023.377025] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] raise exceptions.translate_fault(task_info.error) [ 2023.377025] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2023.378380] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] Faults: ['InvalidArgument'] [ 2023.378380] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] [ 2023.378380] env[62277]: INFO nova.compute.manager [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] [instance: 
13959890-87a1-45ba-98de-621373e265e7] Terminating instance [ 2023.378795] env[62277]: DEBUG oslo_concurrency.lockutils [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2023.379028] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2023.379644] env[62277]: DEBUG nova.compute.manager [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] [instance: 13959890-87a1-45ba-98de-621373e265e7] Start destroying the instance on the hypervisor. {{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2023.379828] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] [instance: 13959890-87a1-45ba-98de-621373e265e7] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2023.380121] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-96550d60-242f-4e7f-b157-aa23a836eca3 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2023.382463] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb3d8248-b19d-4db9-ae69-f56169e75700 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2023.390791] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] [instance: 13959890-87a1-45ba-98de-621373e265e7] Unregistering the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2023.391872] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-ce0305d1-82cc-47e0-839d-8762afd8f2f0 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2023.393398] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2023.393567] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=62277) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2023.394254] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7c0212d4-f186-4760-8c2c-135dc491bc9d {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2023.399426] env[62277]: DEBUG oslo_vmware.api [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Waiting for the task: (returnval){ [ 2023.399426] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52b3ba81-be7e-ccb0-2af7-6bdfb1cc078c" [ 2023.399426] env[62277]: _type = "Task" [ 2023.399426] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2023.409240] env[62277]: DEBUG oslo_vmware.api [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52b3ba81-be7e-ccb0-2af7-6bdfb1cc078c, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2023.466895] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] [instance: 13959890-87a1-45ba-98de-621373e265e7] Unregistered the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2023.467110] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] [instance: 13959890-87a1-45ba-98de-621373e265e7] Deleting contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2023.467291] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Deleting the datastore file [datastore2] 13959890-87a1-45ba-98de-621373e265e7 {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2023.467552] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-9c8dfbf7-7562-4cd9-bae1-88b0fd7057e6 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2023.473581] env[62277]: DEBUG oslo_vmware.api [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Waiting for the task: (returnval){ [ 2023.473581] env[62277]: value = "task-1405479" [ 2023.473581] env[62277]: _type = "Task" [ 2023.473581] env[62277]: } to complete. 
{{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2023.480971] env[62277]: DEBUG oslo_vmware.api [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Task: {'id': task-1405479, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2023.910212] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Preparing fetch location {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2023.910487] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Creating directory with path [datastore2] vmware_temp/034a623b-5839-466c-8b51-16a909e69992/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2023.910718] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a2c4a127-0599-4e38-b9ae-8b8da2c09461 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2023.921550] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Created directory with path [datastore2] vmware_temp/034a623b-5839-466c-8b51-16a909e69992/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2023.921720] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Fetch image to [datastore2] vmware_temp/034a623b-5839-466c-8b51-16a909e69992/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2023.921880] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to [datastore2] vmware_temp/034a623b-5839-466c-8b51-16a909e69992/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2023.922574] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71e25313-2acb-4c95-8d26-d2ffb614bb08 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2023.928752] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f24c6a28-27a8-405a-98a0-3645ff7d86b6 {{(pid=62277) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2023.937641] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3873d02a-a1a3-48b8-a7c8-8b8b29a0dd96 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2023.967194] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-73e92647-a73f-4b7c-94ba-38871aef7989 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2023.972639] env[62277]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-ca78a0f3-ba10-426f-9e9a-27348a7b00ea {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2023.981455] env[62277]: DEBUG oslo_vmware.api [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Task: {'id': task-1405479, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.076607} completed successfully. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2023.981670] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Deleted the datastore file {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2023.981866] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] [instance: 13959890-87a1-45ba-98de-621373e265e7] Deleted contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2023.982031] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] [instance: 13959890-87a1-45ba-98de-621373e265e7] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2023.982205] env[62277]: INFO nova.compute.manager [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] [instance: 13959890-87a1-45ba-98de-621373e265e7] Took 0.60 seconds to destroy the instance on the hypervisor. 
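The DeleteDatastoreFile_Task invocation and the wait_for_task polling above ("progress is 0%" followed by "completed successfully") follow the standard oslo.vmware pattern: submit a vSphere task through the API session, then block until the task finishes or raises a translated fault. A minimal sketch of that pattern, assuming an already-established oslo_vmware.api.VMwareAPISession named `session`; the function name and arguments are illustrative, not copied from the Nova source:

    # Sketch only: submit a datastore file deletion and wait for the vSphere task,
    # roughly what nova.virt.vmwareapi.ds_util.file_delete is doing in the log above.
    from oslo_vmware import exceptions as vexc

    def delete_datastore_file(session, ds_path, dc_ref):
        file_manager = session.vim.service_content.fileManager
        task = session.invoke_api(session.vim, 'DeleteDatastoreFile_Task',
                                  file_manager,
                                  name=ds_path,        # e.g. '[datastore2] <instance uuid>'
                                  datacenter=dc_ref)
        try:
            # wait_for_task polls the task and raises a translated fault
            # (e.g. VimFaultException) if vCenter reports an error.
            session.wait_for_task(task)
        except vexc.FileNotFoundException:
            # Deleting an already-missing file is treated as success.
            pass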
[ 2023.984250] env[62277]: DEBUG nova.compute.claims [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] [instance: 13959890-87a1-45ba-98de-621373e265e7] Aborting claim: {{(pid=62277) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 2023.984432] env[62277]: DEBUG oslo_concurrency.lockutils [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2023.984641] env[62277]: DEBUG oslo_concurrency.lockutils [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2023.995231] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2024.045011] env[62277]: DEBUG oslo_vmware.rw_handles [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/034a623b-5839-466c-8b51-16a909e69992/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62277) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2024.107368] env[62277]: DEBUG oslo_vmware.rw_handles [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Completed reading data from the image iterator. {{(pid=62277) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2024.107637] env[62277]: DEBUG oslo_vmware.rw_handles [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/034a623b-5839-466c-8b51-16a909e69992/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=62277) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2024.244528] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-54dc4bda-a1bf-485e-90b4-d962b509ca82 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2024.251938] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a13e18dd-7b3a-470e-b6b9-5a7208c1c656 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2024.281581] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a0897340-8992-406c-8e52-afa01c7ffbee {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2024.288334] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a20c92eb-db32-4b28-ac55-e5887d498e5a {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2024.300884] env[62277]: DEBUG nova.compute.provider_tree [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2024.308773] env[62277]: DEBUG nova.scheduler.client.report [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2024.324318] env[62277]: DEBUG oslo_concurrency.lockutils [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.340s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2024.324817] env[62277]: ERROR nova.compute.manager [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] [instance: 13959890-87a1-45ba-98de-621373e265e7] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2024.324817] env[62277]: Faults: ['InvalidArgument'] [ 2024.324817] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] Traceback (most recent call last): [ 2024.324817] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in 
_build_and_run_instance [ 2024.324817] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] self.driver.spawn(context, instance, image_meta, [ 2024.324817] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2024.324817] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2024.324817] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2024.324817] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] self._fetch_image_if_missing(context, vi) [ 2024.324817] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2024.324817] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] image_cache(vi, tmp_image_ds_loc) [ 2024.324817] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2024.324817] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] vm_util.copy_virtual_disk( [ 2024.324817] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2024.324817] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] session._wait_for_task(vmdk_copy_task) [ 2024.324817] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2024.324817] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] return self.wait_for_task(task_ref) [ 2024.324817] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2024.324817] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] return evt.wait() [ 2024.324817] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2024.324817] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] result = hub.switch() [ 2024.324817] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2024.324817] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] return self.greenlet.switch() [ 2024.324817] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2024.324817] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] self.f(*self.args, **self.kw) [ 2024.324817] env[62277]: ERROR nova.compute.manager [instance: 
13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2024.324817] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] raise exceptions.translate_fault(task_info.error) [ 2024.324817] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2024.324817] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] Faults: ['InvalidArgument'] [ 2024.324817] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] [ 2024.325630] env[62277]: DEBUG nova.compute.utils [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] [instance: 13959890-87a1-45ba-98de-621373e265e7] VimFaultException {{(pid=62277) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2024.326932] env[62277]: DEBUG nova.compute.manager [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] [instance: 13959890-87a1-45ba-98de-621373e265e7] Build of instance 13959890-87a1-45ba-98de-621373e265e7 was re-scheduled: A specified parameter was not correct: fileType [ 2024.326932] env[62277]: Faults: ['InvalidArgument'] {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2024.327341] env[62277]: DEBUG nova.compute.manager [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] [instance: 13959890-87a1-45ba-98de-621373e265e7] Unplugging VIFs for instance {{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2024.327476] env[62277]: DEBUG nova.compute.manager [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2024.327628] env[62277]: DEBUG nova.compute.manager [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] [instance: 13959890-87a1-45ba-98de-621373e265e7] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2024.327783] env[62277]: DEBUG nova.network.neutron [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] [instance: 13959890-87a1-45ba-98de-621373e265e7] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2024.464195] env[62277]: DEBUG neutronclient.v2_0.client [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=62277) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 2024.465261] env[62277]: ERROR nova.compute.manager [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] [instance: 13959890-87a1-45ba-98de-621373e265e7] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. [ 2024.465261] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] Traceback (most recent call last): [ 2024.465261] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2024.465261] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] self.driver.spawn(context, instance, image_meta, [ 2024.465261] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2024.465261] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2024.465261] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2024.465261] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] self._fetch_image_if_missing(context, vi) [ 2024.465261] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2024.465261] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] image_cache(vi, tmp_image_ds_loc) [ 2024.465261] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2024.465261] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] vm_util.copy_virtual_disk( [ 2024.465261] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File 
"/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2024.465261] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] session._wait_for_task(vmdk_copy_task) [ 2024.465261] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2024.465261] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] return self.wait_for_task(task_ref) [ 2024.465261] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2024.465261] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] return evt.wait() [ 2024.465261] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2024.465261] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] result = hub.switch() [ 2024.465261] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2024.465261] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] return self.greenlet.switch() [ 2024.465261] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2024.465261] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] self.f(*self.args, **self.kw) [ 2024.465261] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2024.465261] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] raise exceptions.translate_fault(task_info.error) [ 2024.465261] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2024.465261] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] Faults: ['InvalidArgument'] [ 2024.465261] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] [ 2024.465261] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] During handling of the above exception, another exception occurred: [ 2024.465261] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] [ 2024.465261] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] Traceback (most recent call last): [ 2024.465261] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/nova/nova/compute/manager.py", line 2430, in _do_build_and_run_instance [ 2024.465261] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] self._build_and_run_instance(context, instance, image, [ 2024.466329] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File 
"/opt/stack/nova/nova/compute/manager.py", line 2722, in _build_and_run_instance [ 2024.466329] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] raise exception.RescheduledException( [ 2024.466329] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] nova.exception.RescheduledException: Build of instance 13959890-87a1-45ba-98de-621373e265e7 was re-scheduled: A specified parameter was not correct: fileType [ 2024.466329] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] Faults: ['InvalidArgument'] [ 2024.466329] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] [ 2024.466329] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] During handling of the above exception, another exception occurred: [ 2024.466329] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] [ 2024.466329] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] Traceback (most recent call last): [ 2024.466329] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2024.466329] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] ret = obj(*args, **kwargs) [ 2024.466329] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 2024.466329] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] exception_handler_v20(status_code, error_body) [ 2024.466329] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 2024.466329] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] raise client_exc(message=error_message, [ 2024.466329] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 2024.466329] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] Neutron server returns request_ids: ['req-af9b43a1-ae30-40ed-ba81-6c7f947fac6a'] [ 2024.466329] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] [ 2024.466329] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] During handling of the above exception, another exception occurred: [ 2024.466329] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] [ 2024.466329] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] Traceback (most recent call last): [ 2024.466329] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/nova/nova/compute/manager.py", line 3019, in _cleanup_allocated_networks [ 2024.466329] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] self._deallocate_network(context, instance, requested_networks) [ 2024.466329] env[62277]: ERROR nova.compute.manager 
[instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 2024.466329] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] self.network_api.deallocate_for_instance( [ 2024.466329] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 2024.466329] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] data = neutron.list_ports(**search_opts) [ 2024.466329] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2024.466329] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] ret = obj(*args, **kwargs) [ 2024.466329] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 2024.466329] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] return self.list('ports', self.ports_path, retrieve_all, [ 2024.466329] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2024.466329] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] ret = obj(*args, **kwargs) [ 2024.466329] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 2024.466329] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] for r in self._pagination(collection, path, **params): [ 2024.467391] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 2024.467391] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] res = self.get(path, params=params) [ 2024.467391] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2024.467391] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] ret = obj(*args, **kwargs) [ 2024.467391] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 2024.467391] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] return self.retry_request("GET", action, body=body, [ 2024.467391] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2024.467391] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] ret = obj(*args, **kwargs) [ 2024.467391] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 2024.467391] env[62277]: ERROR nova.compute.manager [instance: 
13959890-87a1-45ba-98de-621373e265e7] return self.do_request(method, action, body=body, [ 2024.467391] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2024.467391] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] ret = obj(*args, **kwargs) [ 2024.467391] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 2024.467391] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] self._handle_fault_response(status_code, replybody, resp) [ 2024.467391] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 2024.467391] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] raise exception.Unauthorized() [ 2024.467391] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] nova.exception.Unauthorized: Not authorized. [ 2024.467391] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] [ 2024.523575] env[62277]: INFO nova.scheduler.client.report [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Deleted allocations for instance 13959890-87a1-45ba-98de-621373e265e7 [ 2024.544895] env[62277]: DEBUG oslo_concurrency.lockutils [None req-db310efa-859c-46da-918f-a999e7db2046 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Lock "13959890-87a1-45ba-98de-621373e265e7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 618.368s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2024.546062] env[62277]: DEBUG oslo_concurrency.lockutils [None req-56856b71-e730-44f7-b78d-e5970a8d3f03 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Lock "13959890-87a1-45ba-98de-621373e265e7" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 422.945s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2024.546501] env[62277]: DEBUG oslo_concurrency.lockutils [None req-56856b71-e730-44f7-b78d-e5970a8d3f03 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Acquiring lock "13959890-87a1-45ba-98de-621373e265e7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2024.546501] env[62277]: DEBUG oslo_concurrency.lockutils [None req-56856b71-e730-44f7-b78d-e5970a8d3f03 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Lock "13959890-87a1-45ba-98de-621373e265e7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2024.546682] 
env[62277]: DEBUG oslo_concurrency.lockutils [None req-56856b71-e730-44f7-b78d-e5970a8d3f03 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Lock "13959890-87a1-45ba-98de-621373e265e7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2024.548546] env[62277]: INFO nova.compute.manager [None req-56856b71-e730-44f7-b78d-e5970a8d3f03 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] [instance: 13959890-87a1-45ba-98de-621373e265e7] Terminating instance [ 2024.550306] env[62277]: DEBUG nova.compute.manager [None req-56856b71-e730-44f7-b78d-e5970a8d3f03 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] [instance: 13959890-87a1-45ba-98de-621373e265e7] Start destroying the instance on the hypervisor. {{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2024.550716] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-56856b71-e730-44f7-b78d-e5970a8d3f03 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] [instance: 13959890-87a1-45ba-98de-621373e265e7] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2024.551201] env[62277]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-eaecdc72-1bef-4de1-a2e7-0f8f1dce583a {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2024.558662] env[62277]: DEBUG nova.compute.manager [None req-0493c29c-f7e7-4494-8adf-81be904735b0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] [instance: 97a1dad9-a665-42ca-b85e-5fef59ab80bf] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 2024.563953] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e23e3a8a-a39e-4eb2-9cff-58305fb32c24 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2024.593998] env[62277]: WARNING nova.virt.vmwareapi.vmops [None req-56856b71-e730-44f7-b78d-e5970a8d3f03 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] [instance: 13959890-87a1-45ba-98de-621373e265e7] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 13959890-87a1-45ba-98de-621373e265e7 could not be found. [ 2024.594227] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-56856b71-e730-44f7-b78d-e5970a8d3f03 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] [instance: 13959890-87a1-45ba-98de-621373e265e7] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2024.594406] env[62277]: INFO nova.compute.manager [None req-56856b71-e730-44f7-b78d-e5970a8d3f03 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] [instance: 13959890-87a1-45ba-98de-621373e265e7] Took 0.04 seconds to destroy the instance on the hypervisor. 
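The recurring "Acquiring lock ... / acquired ... waited / released ... held" lines, both for "compute_resources" and for the instance UUID itself, are emitted by oslo.concurrency's lock wrapper, which Nova uses to serialise per-instance and resource-tracker work. A minimal sketch of that pattern, with a hypothetical function standing in for the real compute-manager code:

    # Sketch only: the lockutils usage behind the waited/held log lines above.
    # `terminate` is a placeholder, not Nova's real method signature.
    from oslo_concurrency import lockutils

    @lockutils.synchronized('13959890-87a1-45ba-98de-621373e265e7')
    def terminate():
        # Only one greenthread at a time runs per-instance work under the named
        # lock; the "waited"/"held" durations in the log come from this wrapper.
        with lockutils.lock('compute_resources'):
            pass  # resource-tracker bookkeeping happens under a second, coarser lock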
[ 2024.594656] env[62277]: DEBUG oslo.service.loopingcall [None req-56856b71-e730-44f7-b78d-e5970a8d3f03 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2024.595071] env[62277]: DEBUG nova.compute.manager [None req-0493c29c-f7e7-4494-8adf-81be904735b0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] [instance: 97a1dad9-a665-42ca-b85e-5fef59ab80bf] Instance disappeared before build. {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 2024.596197] env[62277]: DEBUG nova.compute.manager [-] [instance: 13959890-87a1-45ba-98de-621373e265e7] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2024.596300] env[62277]: DEBUG nova.network.neutron [-] [instance: 13959890-87a1-45ba-98de-621373e265e7] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2024.618230] env[62277]: DEBUG oslo_concurrency.lockutils [None req-0493c29c-f7e7-4494-8adf-81be904735b0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Lock "97a1dad9-a665-42ca-b85e-5fef59ab80bf" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 206.553s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2024.628709] env[62277]: DEBUG nova.compute.manager [None req-28c38721-5ec2-400f-8af8-69dde787e0ab tempest-ListImageFiltersTestJSON-1819822959 tempest-ListImageFiltersTestJSON-1819822959-project-member] [instance: 7748e3a1-6adc-4482-90f9-a3816a224272] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 2024.679513] env[62277]: DEBUG nova.compute.manager [None req-28c38721-5ec2-400f-8af8-69dde787e0ab tempest-ListImageFiltersTestJSON-1819822959 tempest-ListImageFiltersTestJSON-1819822959-project-member] [instance: 7748e3a1-6adc-4482-90f9-a3816a224272] Instance disappeared before build. {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 2024.703432] env[62277]: DEBUG oslo_concurrency.lockutils [None req-28c38721-5ec2-400f-8af8-69dde787e0ab tempest-ListImageFiltersTestJSON-1819822959 tempest-ListImageFiltersTestJSON-1819822959-project-member] Lock "7748e3a1-6adc-4482-90f9-a3816a224272" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 198.886s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2024.716495] env[62277]: DEBUG nova.compute.manager [None req-c7179512-b148-43e2-bbc6-bc1d50376dcf tempest-ListImageFiltersTestJSON-1819822959 tempest-ListImageFiltersTestJSON-1819822959-project-member] [instance: 71f172bf-94bd-4027-bbae-f3bd3e1f91c3] Starting instance... 
{{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 2024.718094] env[62277]: DEBUG neutronclient.v2_0.client [-] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=62277) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 2024.718094] env[62277]: ERROR nova.network.neutron [-] Neutron client was not able to generate a valid admin token, please verify Neutron admin credential located in nova.conf: neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 2024.718555] env[62277]: ERROR oslo.service.loopingcall [-] Dynamic interval looping call 'oslo_service.loopingcall.RetryDecorator.__call__.._func' failed: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 2024.718555] env[62277]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 2024.718555] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2024.718555] env[62277]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 2024.718555] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 2024.718555] env[62277]: ERROR oslo.service.loopingcall exception_handler_v20(status_code, error_body) [ 2024.718555] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 2024.718555] env[62277]: ERROR oslo.service.loopingcall raise client_exc(message=error_message, [ 2024.718555] env[62277]: ERROR oslo.service.loopingcall neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 2024.718555] env[62277]: ERROR oslo.service.loopingcall Neutron server returns request_ids: ['req-6421e994-2b1a-490e-b2b4-0f974f2e03aa'] [ 2024.718555] env[62277]: ERROR oslo.service.loopingcall [ 2024.718555] env[62277]: ERROR oslo.service.loopingcall During handling of the above exception, another exception occurred: [ 2024.718555] env[62277]: ERROR oslo.service.loopingcall [ 2024.718555] env[62277]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 2024.718555] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 2024.718555] env[62277]: ERROR oslo.service.loopingcall result = func(*self.args, **self.kw) [ 2024.718555] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 2024.718555] env[62277]: ERROR oslo.service.loopingcall result = f(*args, **kwargs) [ 2024.718555] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 2024.718555] env[62277]: ERROR oslo.service.loopingcall self._deallocate_network( [ 2024.718555] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 2024.718555] env[62277]: ERROR oslo.service.loopingcall 
self.network_api.deallocate_for_instance( [ 2024.718555] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 2024.718555] env[62277]: ERROR oslo.service.loopingcall data = neutron.list_ports(**search_opts) [ 2024.718555] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2024.718555] env[62277]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 2024.718555] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 2024.718555] env[62277]: ERROR oslo.service.loopingcall return self.list('ports', self.ports_path, retrieve_all, [ 2024.718555] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2024.718555] env[62277]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 2024.718555] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 2024.718555] env[62277]: ERROR oslo.service.loopingcall for r in self._pagination(collection, path, **params): [ 2024.718555] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 2024.718555] env[62277]: ERROR oslo.service.loopingcall res = self.get(path, params=params) [ 2024.718555] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2024.718555] env[62277]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 2024.718555] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 2024.718555] env[62277]: ERROR oslo.service.loopingcall return self.retry_request("GET", action, body=body, [ 2024.718555] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2024.718555] env[62277]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 2024.718555] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 2024.718555] env[62277]: ERROR oslo.service.loopingcall return self.do_request(method, action, body=body, [ 2024.718555] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2024.718555] env[62277]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 2024.718555] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 2024.718555] env[62277]: ERROR oslo.service.loopingcall self._handle_fault_response(status_code, replybody, resp) [ 2024.719941] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 2024.719941] env[62277]: ERROR oslo.service.loopingcall raise exception.NeutronAdminCredentialConfigurationInvalid() [ 2024.719941] env[62277]: ERROR oslo.service.loopingcall nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 2024.719941] env[62277]: ERROR oslo.service.loopingcall [ 2024.719941] env[62277]: ERROR nova.compute.manager [None req-56856b71-e730-44f7-b78d-e5970a8d3f03 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] [instance: 13959890-87a1-45ba-98de-621373e265e7] Failed to deallocate network for instance. Error: Networking client is experiencing an unauthorized exception.: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 2024.738470] env[62277]: DEBUG nova.compute.manager [None req-c7179512-b148-43e2-bbc6-bc1d50376dcf tempest-ListImageFiltersTestJSON-1819822959 tempest-ListImageFiltersTestJSON-1819822959-project-member] [instance: 71f172bf-94bd-4027-bbae-f3bd3e1f91c3] Instance disappeared before build. {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 2024.758130] env[62277]: DEBUG oslo_concurrency.lockutils [None req-c7179512-b148-43e2-bbc6-bc1d50376dcf tempest-ListImageFiltersTestJSON-1819822959 tempest-ListImageFiltersTestJSON-1819822959-project-member] Lock "71f172bf-94bd-4027-bbae-f3bd3e1f91c3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 198.727s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2024.759679] env[62277]: ERROR nova.compute.manager [None req-56856b71-e730-44f7-b78d-e5970a8d3f03 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] [instance: 13959890-87a1-45ba-98de-621373e265e7] Setting instance vm_state to ERROR: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 2024.759679] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] Traceback (most recent call last): [ 2024.759679] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2024.759679] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] ret = obj(*args, **kwargs) [ 2024.759679] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 2024.759679] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] exception_handler_v20(status_code, error_body) [ 2024.759679] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 2024.759679] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] raise client_exc(message=error_message, [ 2024.759679] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 2024.759679] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] Neutron server returns request_ids: ['req-6421e994-2b1a-490e-b2b4-0f974f2e03aa'] [ 2024.759679] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] [ 2024.759679] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] During handling of the above exception, another exception occurred: [ 2024.759679] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] [ 2024.759679] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] Traceback (most recent call last): [ 2024.759679] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance [ 2024.759679] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] self._delete_instance(context, instance, bdms) [ 2024.759679] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance [ 2024.759679] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] self._shutdown_instance(context, instance, bdms) [ 2024.759679] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance [ 2024.759679] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] self._try_deallocate_network(context, instance, requested_networks) [ 2024.759679] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network [ 2024.759679] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] with excutils.save_and_reraise_exception(): [ 2024.759679] env[62277]: ERROR 
nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 2024.759679] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] self.force_reraise() [ 2024.759679] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 2024.759679] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] raise self.value [ 2024.759679] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network [ 2024.759679] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] _deallocate_network_with_retries() [ 2024.759679] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 2024.759679] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] return evt.wait() [ 2024.759679] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2024.759679] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] result = hub.switch() [ 2024.760815] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2024.760815] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] return self.greenlet.switch() [ 2024.760815] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 2024.760815] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] result = func(*self.args, **self.kw) [ 2024.760815] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 2024.760815] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] result = f(*args, **kwargs) [ 2024.760815] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 2024.760815] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] self._deallocate_network( [ 2024.760815] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 2024.760815] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] self.network_api.deallocate_for_instance( [ 2024.760815] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 2024.760815] env[62277]: ERROR nova.compute.manager [instance: 
13959890-87a1-45ba-98de-621373e265e7] data = neutron.list_ports(**search_opts) [ 2024.760815] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2024.760815] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] ret = obj(*args, **kwargs) [ 2024.760815] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 2024.760815] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] return self.list('ports', self.ports_path, retrieve_all, [ 2024.760815] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2024.760815] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] ret = obj(*args, **kwargs) [ 2024.760815] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 2024.760815] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] for r in self._pagination(collection, path, **params): [ 2024.760815] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 2024.760815] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] res = self.get(path, params=params) [ 2024.760815] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2024.760815] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] ret = obj(*args, **kwargs) [ 2024.760815] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 2024.760815] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] return self.retry_request("GET", action, body=body, [ 2024.760815] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2024.760815] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] ret = obj(*args, **kwargs) [ 2024.760815] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 2024.760815] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] return self.do_request(method, action, body=body, [ 2024.760815] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2024.760815] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] ret = obj(*args, **kwargs) [ 2024.760815] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 2024.761921] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] self._handle_fault_response(status_code, replybody, resp) [ 2024.761921] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 2024.761921] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] raise exception.NeutronAdminCredentialConfigurationInvalid() [ 2024.761921] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 2024.761921] env[62277]: ERROR nova.compute.manager [instance: 13959890-87a1-45ba-98de-621373e265e7] [ 2024.769714] env[62277]: DEBUG nova.compute.manager [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 2024.795601] env[62277]: DEBUG oslo_concurrency.lockutils [None req-56856b71-e730-44f7-b78d-e5970a8d3f03 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Lock "13959890-87a1-45ba-98de-621373e265e7" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.249s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2024.796903] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "13959890-87a1-45ba-98de-621373e265e7" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 154.671s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2024.797100] env[62277]: INFO nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 13959890-87a1-45ba-98de-621373e265e7] During sync_power_state the instance has a pending task (deleting). Skip. 
[ 2024.797273] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "13959890-87a1-45ba-98de-621373e265e7" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2024.836958] env[62277]: DEBUG oslo_concurrency.lockutils [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2024.837225] env[62277]: DEBUG oslo_concurrency.lockutils [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2024.839497] env[62277]: INFO nova.compute.claims [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2024.847703] env[62277]: INFO nova.compute.manager [None req-56856b71-e730-44f7-b78d-e5970a8d3f03 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] [instance: 13959890-87a1-45ba-98de-621373e265e7] Successfully reverted task state from None on failure for instance. [ 2024.851457] env[62277]: ERROR oslo_messaging.rpc.server [None req-56856b71-e730-44f7-b78d-e5970a8d3f03 tempest-ServersAdminNegativeTestJSON-1613847164 tempest-ServersAdminNegativeTestJSON-1613847164-project-member] Exception during message handling: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
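The "Acquiring lock ... / acquired ... waited ... / released ... held ..." DEBUG lines around these records come from oslo.concurrency's synchronized decorator: the terminate path and the periodic _sync_power_states task serialize on the instance UUID, which is why the power-state sync had to wait 154.671s and then skipped the instance. A minimal sketch of that locking pattern follows, assuming oslo.concurrency is installed; the function and lock name are illustrative, not nova's actual code.

```python
# Sketch of per-instance serialization with oslo.concurrency, mirroring the
# lockutils DEBUG lines above. Illustrative only; not nova's implementation.
from oslo_concurrency import lockutils

INSTANCE_UUID = "13959890-87a1-45ba-98de-621373e265e7"


@lockutils.synchronized(INSTANCE_UUID)
def do_terminate_instance():
    # Runs with the per-instance lock held, so a concurrent task keyed on the
    # same UUID (e.g. a power-state sync) blocks until this function returns.
    print("terminating instance while holding its lock")


if __name__ == "__main__":
    do_terminate_instance()
```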
[ 2024.851457] env[62277]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 2024.851457] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2024.851457] env[62277]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 2024.851457] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 2024.851457] env[62277]: ERROR oslo_messaging.rpc.server exception_handler_v20(status_code, error_body) [ 2024.851457] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 2024.851457] env[62277]: ERROR oslo_messaging.rpc.server raise client_exc(message=error_message, [ 2024.851457] env[62277]: ERROR oslo_messaging.rpc.server neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 2024.851457] env[62277]: ERROR oslo_messaging.rpc.server Neutron server returns request_ids: ['req-6421e994-2b1a-490e-b2b4-0f974f2e03aa'] [ 2024.851457] env[62277]: ERROR oslo_messaging.rpc.server [ 2024.851457] env[62277]: ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred: [ 2024.851457] env[62277]: ERROR oslo_messaging.rpc.server [ 2024.851457] env[62277]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 2024.851457] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming [ 2024.851457] env[62277]: ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message) [ 2024.851457] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch [ 2024.851457] env[62277]: ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) [ 2024.851457] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch [ 2024.851457] env[62277]: ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) [ 2024.851457] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 65, in wrapped [ 2024.851457] env[62277]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 2024.851457] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 2024.851457] env[62277]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 2024.851457] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 2024.851457] env[62277]: ERROR oslo_messaging.rpc.server raise self.value [ 2024.851457] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 63, in wrapped [ 2024.851457] env[62277]: ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw) [ 2024.851457] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 166, in decorated_function [ 2024.851457] env[62277]: ERROR oslo_messaging.rpc.server with 
excutils.save_and_reraise_exception(): [ 2024.851457] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 2024.851457] env[62277]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 2024.851457] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 2024.851457] env[62277]: ERROR oslo_messaging.rpc.server raise self.value [ 2024.851457] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 157, in decorated_function [ 2024.851457] env[62277]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 2024.851457] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/utils.py", line 1439, in decorated_function [ 2024.851457] env[62277]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 2024.851457] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 213, in decorated_function [ 2024.851457] env[62277]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 2024.851457] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 2024.851457] env[62277]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 2024.851457] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 2024.851457] env[62277]: ERROR oslo_messaging.rpc.server raise self.value [ 2024.852754] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 203, in decorated_function [ 2024.852754] env[62277]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 2024.852754] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3327, in terminate_instance [ 2024.852754] env[62277]: ERROR oslo_messaging.rpc.server do_terminate_instance(instance, bdms) [ 2024.852754] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 2024.852754] env[62277]: ERROR oslo_messaging.rpc.server return f(*args, **kwargs) [ 2024.852754] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3322, in do_terminate_instance [ 2024.852754] env[62277]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 2024.852754] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 2024.852754] env[62277]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 2024.852754] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 2024.852754] env[62277]: ERROR oslo_messaging.rpc.server raise self.value [ 2024.852754] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance [ 2024.852754] env[62277]: ERROR oslo_messaging.rpc.server self._delete_instance(context, instance, bdms) [ 2024.852754] env[62277]: ERROR oslo_messaging.rpc.server File 
"/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance [ 2024.852754] env[62277]: ERROR oslo_messaging.rpc.server self._shutdown_instance(context, instance, bdms) [ 2024.852754] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance [ 2024.852754] env[62277]: ERROR oslo_messaging.rpc.server self._try_deallocate_network(context, instance, requested_networks) [ 2024.852754] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network [ 2024.852754] env[62277]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 2024.852754] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 2024.852754] env[62277]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 2024.852754] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 2024.852754] env[62277]: ERROR oslo_messaging.rpc.server raise self.value [ 2024.852754] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network [ 2024.852754] env[62277]: ERROR oslo_messaging.rpc.server _deallocate_network_with_retries() [ 2024.852754] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 2024.852754] env[62277]: ERROR oslo_messaging.rpc.server return evt.wait() [ 2024.852754] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2024.852754] env[62277]: ERROR oslo_messaging.rpc.server result = hub.switch() [ 2024.852754] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2024.852754] env[62277]: ERROR oslo_messaging.rpc.server return self.greenlet.switch() [ 2024.852754] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 2024.852754] env[62277]: ERROR oslo_messaging.rpc.server result = func(*self.args, **self.kw) [ 2024.852754] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 2024.852754] env[62277]: ERROR oslo_messaging.rpc.server result = f(*args, **kwargs) [ 2024.852754] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 2024.852754] env[62277]: ERROR oslo_messaging.rpc.server self._deallocate_network( [ 2024.852754] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 2024.852754] env[62277]: ERROR oslo_messaging.rpc.server self.network_api.deallocate_for_instance( [ 2024.852754] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 2024.852754] env[62277]: ERROR oslo_messaging.rpc.server data = neutron.list_ports(**search_opts) [ 2024.852754] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2024.852754] env[62277]: ERROR 
oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 2024.852754] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 2024.852754] env[62277]: ERROR oslo_messaging.rpc.server return self.list('ports', self.ports_path, retrieve_all, [ 2024.852754] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2024.854350] env[62277]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 2024.854350] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 2024.854350] env[62277]: ERROR oslo_messaging.rpc.server for r in self._pagination(collection, path, **params): [ 2024.854350] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 2024.854350] env[62277]: ERROR oslo_messaging.rpc.server res = self.get(path, params=params) [ 2024.854350] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2024.854350] env[62277]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 2024.854350] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 2024.854350] env[62277]: ERROR oslo_messaging.rpc.server return self.retry_request("GET", action, body=body, [ 2024.854350] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2024.854350] env[62277]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 2024.854350] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 2024.854350] env[62277]: ERROR oslo_messaging.rpc.server return self.do_request(method, action, body=body, [ 2024.854350] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2024.854350] env[62277]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 2024.854350] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 2024.854350] env[62277]: ERROR oslo_messaging.rpc.server self._handle_fault_response(status_code, replybody, resp) [ 2024.854350] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 2024.854350] env[62277]: ERROR oslo_messaging.rpc.server raise exception.NeutronAdminCredentialConfigurationInvalid() [ 2024.854350] env[62277]: ERROR oslo_messaging.rpc.server nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
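The many repeated excutils frames in this traceback (save_and_reraise_exception, force_reraise, raise self.value) are the standard oslo.utils pattern: each layer of the terminate path runs its cleanup (reverting the task state, emitting notifications) and then re-raises the original exception, which is why the log shows both "Successfully reverted task state" and the full error. Below is a minimal sketch of that pattern, assuming oslo.utils is available; the cleanup function and the simulated exception are stand-ins, not nova's actual code.

```python
# Sketch of the save_and_reraise_exception pattern seen throughout the
# traceback above. Illustrative stand-ins only.
from oslo_utils import excutils


def revert_task_state(instance_uuid):
    # Stand-in for the "Successfully reverted task state ..." step in the log.
    print(f"reverting task state for {instance_uuid}")


def terminate_instance(instance_uuid):
    try:
        raise RuntimeError("simulated failure while deallocating the network")
    except RuntimeError:
        # Cleanup runs inside the with-block; the original exception is
        # re-raised automatically when the block exits.
        with excutils.save_and_reraise_exception():
            revert_task_state(instance_uuid)


if __name__ == "__main__":
    try:
        terminate_instance("13959890-87a1-45ba-98de-621373e265e7")
    except RuntimeError as exc:
        print(f"still raised after cleanup: {exc}")
```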
[ 2024.854350] env[62277]: ERROR oslo_messaging.rpc.server [ 2025.046609] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f3b91c2-d9de-4b79-9795-9fab3701aad3 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2025.054067] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-796948b8-c620-45ed-b7f8-ac6d864ad92b {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2025.084300] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3d579a4b-e13b-4087-9f00-6d1ba1a0ecbd {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2025.091461] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62278544-c3cd-479a-8741-aedffe95a1fc {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2025.104639] env[62277]: DEBUG nova.compute.provider_tree [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2025.113557] env[62277]: DEBUG nova.scheduler.client.report [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2025.129249] env[62277]: DEBUG oslo_concurrency.lockutils [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.292s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2025.129684] env[62277]: DEBUG nova.compute.manager [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Start building networks asynchronously for instance. 
{{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 2025.160019] env[62277]: DEBUG nova.compute.utils [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Using /dev/sd instead of None {{(pid=62277) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2025.161316] env[62277]: DEBUG nova.compute.manager [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Allocating IP information in the background. {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 2025.161485] env[62277]: DEBUG nova.network.neutron [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] allocate_for_instance() {{(pid=62277) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2025.171162] env[62277]: DEBUG nova.compute.manager [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Start building block device mappings for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 2025.219470] env[62277]: DEBUG nova.policy [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4600f6c9a0554b8a8077a3977337bfde', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bd7e0cacdaeb4e6e80d603d41978a23f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62277) authorize /opt/stack/nova/nova/policy.py:203}} [ 2025.233282] env[62277]: DEBUG nova.compute.manager [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Start spawning the instance on the hypervisor. 
{{(pid=62277) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 2025.258015] env[62277]: DEBUG nova.virt.hardware [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-11-06T22:11:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-11-06T22:11:15Z,direct_url=,disk_format='vmdk',id=6f125163-af69-40e9-92ae-3b8a01d74b60,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='63ed6630e9c140baa826f53d7a0564d1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-11-06T22:11:16Z,virtual_size=,visibility=), allow threads: False {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2025.258270] env[62277]: DEBUG nova.virt.hardware [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Flavor limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2025.258423] env[62277]: DEBUG nova.virt.hardware [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Image limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2025.258600] env[62277]: DEBUG nova.virt.hardware [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Flavor pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2025.258742] env[62277]: DEBUG nova.virt.hardware [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Image pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2025.258886] env[62277]: DEBUG nova.virt.hardware [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2025.259200] env[62277]: DEBUG nova.virt.hardware [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2025.259463] env[62277]: DEBUG nova.virt.hardware [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62277) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 2025.259758] env[62277]: DEBUG nova.virt.hardware [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Got 1 possible topologies {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2025.260123] env[62277]: DEBUG nova.virt.hardware [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2025.260356] env[62277]: DEBUG nova.virt.hardware [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2025.261208] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b3670e4-6aaa-4611-9187-52224c5523c1 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2025.269265] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e3a6c1b9-10d1-4abd-a90d-b6883ec81c10 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2025.548843] env[62277]: DEBUG nova.network.neutron [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Successfully created port: d8c37e2c-9810-47a8-a823-d1d94ed2561c {{(pid=62277) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2026.165149] env[62277]: DEBUG nova.compute.manager [req-2043aa8b-51e1-4eb5-88e6-f1727845d395 req-a327b39d-90e4-4832-8551-766e35909847 service nova] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Received event network-vif-plugged-d8c37e2c-9810-47a8-a823-d1d94ed2561c {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 2026.166011] env[62277]: DEBUG oslo_concurrency.lockutils [req-2043aa8b-51e1-4eb5-88e6-f1727845d395 req-a327b39d-90e4-4832-8551-766e35909847 service nova] Acquiring lock "ad83bf06-d712-4bd4-8086-9c3b615adaf5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2026.166011] env[62277]: DEBUG oslo_concurrency.lockutils [req-2043aa8b-51e1-4eb5-88e6-f1727845d395 req-a327b39d-90e4-4832-8551-766e35909847 service nova] Lock "ad83bf06-d712-4bd4-8086-9c3b615adaf5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2026.166011] env[62277]: DEBUG oslo_concurrency.lockutils [req-2043aa8b-51e1-4eb5-88e6-f1727845d395 req-a327b39d-90e4-4832-8551-766e35909847 service nova] Lock "ad83bf06-d712-4bd4-8086-9c3b615adaf5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=62277) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2026.166011] env[62277]: DEBUG nova.compute.manager [req-2043aa8b-51e1-4eb5-88e6-f1727845d395 req-a327b39d-90e4-4832-8551-766e35909847 service nova] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] No waiting events found dispatching network-vif-plugged-d8c37e2c-9810-47a8-a823-d1d94ed2561c {{(pid=62277) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2026.166321] env[62277]: WARNING nova.compute.manager [req-2043aa8b-51e1-4eb5-88e6-f1727845d395 req-a327b39d-90e4-4832-8551-766e35909847 service nova] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Received unexpected event network-vif-plugged-d8c37e2c-9810-47a8-a823-d1d94ed2561c for instance with vm_state building and task_state spawning. [ 2026.250118] env[62277]: DEBUG nova.network.neutron [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Successfully updated port: d8c37e2c-9810-47a8-a823-d1d94ed2561c {{(pid=62277) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2026.265538] env[62277]: DEBUG oslo_concurrency.lockutils [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Acquiring lock "refresh_cache-ad83bf06-d712-4bd4-8086-9c3b615adaf5" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2026.265808] env[62277]: DEBUG oslo_concurrency.lockutils [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Acquired lock "refresh_cache-ad83bf06-d712-4bd4-8086-9c3b615adaf5" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2026.266084] env[62277]: DEBUG nova.network.neutron [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Building network info cache for instance {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2026.301558] env[62277]: DEBUG nova.network.neutron [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Instance cache missing network info. 
{{(pid=62277) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2026.471856] env[62277]: DEBUG nova.network.neutron [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Updating instance_info_cache with network_info: [{"id": "d8c37e2c-9810-47a8-a823-d1d94ed2561c", "address": "fa:16:3e:c4:ed:a1", "network": {"id": "f0d2c639-a921-4764-8b8a-3590b3b7d7f3", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1936428655-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "bd7e0cacdaeb4e6e80d603d41978a23f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3b7bf7d4-8e0c-4cee-84ba-244e73ef6379", "external-id": "nsx-vlan-transportzone-423", "segmentation_id": 423, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd8c37e2c-98", "ovs_interfaceid": "d8c37e2c-9810-47a8-a823-d1d94ed2561c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2026.485496] env[62277]: DEBUG oslo_concurrency.lockutils [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Releasing lock "refresh_cache-ad83bf06-d712-4bd4-8086-9c3b615adaf5" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2026.485843] env[62277]: DEBUG nova.compute.manager [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Instance network_info: |[{"id": "d8c37e2c-9810-47a8-a823-d1d94ed2561c", "address": "fa:16:3e:c4:ed:a1", "network": {"id": "f0d2c639-a921-4764-8b8a-3590b3b7d7f3", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1936428655-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "bd7e0cacdaeb4e6e80d603d41978a23f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3b7bf7d4-8e0c-4cee-84ba-244e73ef6379", "external-id": "nsx-vlan-transportzone-423", "segmentation_id": 423, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd8c37e2c-98", "ovs_interfaceid": "d8c37e2c-9810-47a8-a823-d1d94ed2561c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62277) 
_allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 2026.486291] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:c4:ed:a1', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '3b7bf7d4-8e0c-4cee-84ba-244e73ef6379', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'd8c37e2c-9810-47a8-a823-d1d94ed2561c', 'vif_model': 'vmxnet3'}] {{(pid=62277) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2026.494283] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Creating folder: Project (bd7e0cacdaeb4e6e80d603d41978a23f). Parent ref: group-v297781. {{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 2026.494818] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-3528724d-7206-42ee-8b55-f78827da0f18 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2026.505841] env[62277]: INFO nova.virt.vmwareapi.vm_util [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Created folder: Project (bd7e0cacdaeb4e6e80d603d41978a23f) in parent group-v297781. [ 2026.506030] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Creating folder: Instances. Parent ref: group-v297883. {{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 2026.506256] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e3f4c513-9990-456b-bb59-7df7dd203eea {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2026.515471] env[62277]: INFO nova.virt.vmwareapi.vm_util [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Created folder: Instances in parent group-v297883. [ 2026.515694] env[62277]: DEBUG oslo.service.loopingcall [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2026.515869] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Creating VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2026.516074] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-2f5943d4-a997-4a2b-baeb-c8b0875d3bb8 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2026.534860] env[62277]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2026.534860] env[62277]: value = "task-1405482" [ 2026.534860] env[62277]: _type = "Task" [ 2026.534860] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2026.541849] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405482, 'name': CreateVM_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2027.044488] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405482, 'name': CreateVM_Task, 'duration_secs': 0.309864} completed successfully. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2027.044655] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Created VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2027.045772] env[62277]: DEBUG oslo_concurrency.lockutils [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2027.045772] env[62277]: DEBUG oslo_concurrency.lockutils [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2027.045772] env[62277]: DEBUG oslo_concurrency.lockutils [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2027.045970] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3924634f-b0eb-4456-95a7-d175fea803d8 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2027.050466] env[62277]: DEBUG oslo_vmware.api [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Waiting for the task: (returnval){ [ 2027.050466] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52e579f2-e5c6-9bd3-cd6e-d9369244e66f" [ 2027.050466] env[62277]: _type = "Task" [ 2027.050466] env[62277]: } to 
complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2027.058569] env[62277]: DEBUG oslo_vmware.api [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52e579f2-e5c6-9bd3-cd6e-d9369244e66f, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2027.562199] env[62277]: DEBUG oslo_concurrency.lockutils [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2027.562528] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Processing image 6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2027.562528] env[62277]: DEBUG oslo_concurrency.lockutils [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2028.196477] env[62277]: DEBUG nova.compute.manager [req-679f0f59-2503-4dee-88ba-bd6a3c6e938c req-2ed38dad-43f1-4b0e-89f5-ff276756ece7 service nova] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Received event network-changed-d8c37e2c-9810-47a8-a823-d1d94ed2561c {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 2028.196671] env[62277]: DEBUG nova.compute.manager [req-679f0f59-2503-4dee-88ba-bd6a3c6e938c req-2ed38dad-43f1-4b0e-89f5-ff276756ece7 service nova] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Refreshing instance network info cache due to event network-changed-d8c37e2c-9810-47a8-a823-d1d94ed2561c. 
{{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11104}} [ 2028.196880] env[62277]: DEBUG oslo_concurrency.lockutils [req-679f0f59-2503-4dee-88ba-bd6a3c6e938c req-2ed38dad-43f1-4b0e-89f5-ff276756ece7 service nova] Acquiring lock "refresh_cache-ad83bf06-d712-4bd4-8086-9c3b615adaf5" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2028.197030] env[62277]: DEBUG oslo_concurrency.lockutils [req-679f0f59-2503-4dee-88ba-bd6a3c6e938c req-2ed38dad-43f1-4b0e-89f5-ff276756ece7 service nova] Acquired lock "refresh_cache-ad83bf06-d712-4bd4-8086-9c3b615adaf5" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2028.197190] env[62277]: DEBUG nova.network.neutron [req-679f0f59-2503-4dee-88ba-bd6a3c6e938c req-2ed38dad-43f1-4b0e-89f5-ff276756ece7 service nova] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Refreshing network info cache for port d8c37e2c-9810-47a8-a823-d1d94ed2561c {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 2028.470889] env[62277]: DEBUG nova.network.neutron [req-679f0f59-2503-4dee-88ba-bd6a3c6e938c req-2ed38dad-43f1-4b0e-89f5-ff276756ece7 service nova] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Updated VIF entry in instance network info cache for port d8c37e2c-9810-47a8-a823-d1d94ed2561c. {{(pid=62277) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 2028.471299] env[62277]: DEBUG nova.network.neutron [req-679f0f59-2503-4dee-88ba-bd6a3c6e938c req-2ed38dad-43f1-4b0e-89f5-ff276756ece7 service nova] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Updating instance_info_cache with network_info: [{"id": "d8c37e2c-9810-47a8-a823-d1d94ed2561c", "address": "fa:16:3e:c4:ed:a1", "network": {"id": "f0d2c639-a921-4764-8b8a-3590b3b7d7f3", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1936428655-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "bd7e0cacdaeb4e6e80d603d41978a23f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3b7bf7d4-8e0c-4cee-84ba-244e73ef6379", "external-id": "nsx-vlan-transportzone-423", "segmentation_id": 423, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd8c37e2c-98", "ovs_interfaceid": "d8c37e2c-9810-47a8-a823-d1d94ed2561c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2028.481102] env[62277]: DEBUG oslo_concurrency.lockutils [req-679f0f59-2503-4dee-88ba-bd6a3c6e938c req-2ed38dad-43f1-4b0e-89f5-ff276756ece7 service nova] Releasing lock "refresh_cache-ad83bf06-d712-4bd4-8086-9c3b615adaf5" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2036.267976] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7028cb85-dee9-4ac8-aa89-05c1fceb35f7 tempest-AttachVolumeShelveTestJSON-1444003900 
tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Acquiring lock "ad83bf06-d712-4bd4-8086-9c3b615adaf5" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2045.486269] env[62277]: DEBUG oslo_concurrency.lockutils [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Acquiring lock "6737c3b9-d9e6-4879-a6df-46d3c7dee40e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2045.486564] env[62277]: DEBUG oslo_concurrency.lockutils [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Lock "6737c3b9-d9e6-4879-a6df-46d3c7dee40e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2053.924418] env[62277]: DEBUG oslo_concurrency.lockutils [None req-2d985159-9f30-4671-8956-f52f4b1e3bec tempest-AttachVolumeNegativeTest-184472694 tempest-AttachVolumeNegativeTest-184472694-project-member] Acquiring lock "40a309bf-6b7f-4360-a083-640db68bb00b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2053.924752] env[62277]: DEBUG oslo_concurrency.lockutils [None req-2d985159-9f30-4671-8956-f52f4b1e3bec tempest-AttachVolumeNegativeTest-184472694 tempest-AttachVolumeNegativeTest-184472694-project-member] Lock "40a309bf-6b7f-4360-a083-640db68bb00b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2068.169272] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2069.955209] env[62277]: WARNING oslo_vmware.rw_handles [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2069.955209] env[62277]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2069.955209] env[62277]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2069.955209] env[62277]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2069.955209] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2069.955209] env[62277]: ERROR oslo_vmware.rw_handles response.begin() [ 2069.955209] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2069.955209] env[62277]: ERROR oslo_vmware.rw_handles version, 
status, reason = self._read_status() [ 2069.955209] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2069.955209] env[62277]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2069.955209] env[62277]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2069.955209] env[62277]: ERROR oslo_vmware.rw_handles [ 2069.955209] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Downloaded image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to vmware_temp/034a623b-5839-466c-8b51-16a909e69992/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2069.957203] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Caching image {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2069.957440] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Copying Virtual Disk [datastore2] vmware_temp/034a623b-5839-466c-8b51-16a909e69992/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk to [datastore2] vmware_temp/034a623b-5839-466c-8b51-16a909e69992/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk {{(pid=62277) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2069.957731] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-6652f917-8162-4080-bc8d-7c21e183522e {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2069.966064] env[62277]: DEBUG oslo_vmware.api [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Waiting for the task: (returnval){ [ 2069.966064] env[62277]: value = "task-1405483" [ 2069.966064] env[62277]: _type = "Task" [ 2069.966064] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2069.974093] env[62277]: DEBUG oslo_vmware.api [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Task: {'id': task-1405483, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2070.476147] env[62277]: DEBUG oslo_vmware.exceptions [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Fault InvalidArgument not matched. 
{{(pid=62277) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2070.476414] env[62277]: DEBUG oslo_concurrency.lockutils [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2070.476945] env[62277]: ERROR nova.compute.manager [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2070.476945] env[62277]: Faults: ['InvalidArgument'] [ 2070.476945] env[62277]: ERROR nova.compute.manager [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Traceback (most recent call last): [ 2070.476945] env[62277]: ERROR nova.compute.manager [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2070.476945] env[62277]: ERROR nova.compute.manager [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] yield resources [ 2070.476945] env[62277]: ERROR nova.compute.manager [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2070.476945] env[62277]: ERROR nova.compute.manager [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] self.driver.spawn(context, instance, image_meta, [ 2070.476945] env[62277]: ERROR nova.compute.manager [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2070.476945] env[62277]: ERROR nova.compute.manager [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2070.476945] env[62277]: ERROR nova.compute.manager [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2070.476945] env[62277]: ERROR nova.compute.manager [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] self._fetch_image_if_missing(context, vi) [ 2070.476945] env[62277]: ERROR nova.compute.manager [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2070.476945] env[62277]: ERROR nova.compute.manager [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] image_cache(vi, tmp_image_ds_loc) [ 2070.476945] env[62277]: ERROR nova.compute.manager [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2070.476945] env[62277]: ERROR nova.compute.manager [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] vm_util.copy_virtual_disk( [ 2070.476945] env[62277]: ERROR nova.compute.manager [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2070.476945] env[62277]: ERROR nova.compute.manager [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] session._wait_for_task(vmdk_copy_task) [ 2070.476945] env[62277]: ERROR nova.compute.manager [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2070.476945] env[62277]: ERROR nova.compute.manager [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] return self.wait_for_task(task_ref) [ 2070.476945] env[62277]: ERROR nova.compute.manager [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2070.476945] env[62277]: ERROR nova.compute.manager [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] return evt.wait() [ 2070.476945] env[62277]: ERROR nova.compute.manager [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2070.476945] env[62277]: ERROR nova.compute.manager [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] result = hub.switch() [ 2070.476945] env[62277]: ERROR nova.compute.manager [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2070.476945] env[62277]: ERROR nova.compute.manager [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] return self.greenlet.switch() [ 2070.476945] env[62277]: ERROR nova.compute.manager [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2070.476945] env[62277]: ERROR nova.compute.manager [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] self.f(*self.args, **self.kw) [ 2070.476945] env[62277]: ERROR nova.compute.manager [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2070.476945] env[62277]: ERROR nova.compute.manager [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] raise exceptions.translate_fault(task_info.error) [ 2070.476945] env[62277]: ERROR nova.compute.manager [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2070.476945] env[62277]: ERROR nova.compute.manager [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Faults: ['InvalidArgument'] [ 2070.476945] env[62277]: ERROR nova.compute.manager [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] [ 2070.477994] env[62277]: INFO nova.compute.manager [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Terminating instance [ 2070.479029] env[62277]: DEBUG oslo_concurrency.lockutils [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2070.479029] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2070.479225] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with 
opID=oslo.vmware-785345c2-ff8b-4ff4-bb5c-b633f738f245 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2070.481464] env[62277]: DEBUG nova.compute.manager [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Start destroying the instance on the hypervisor. {{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2070.481646] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2070.482357] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f9a73a48-41c1-49e5-91b9-c0719ae528bf {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2070.489167] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Unregistering the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2070.490098] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-ea4b1796-8e85-49a1-9835-37329699d374 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2070.491421] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2070.491593] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62277) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2070.492246] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-43735fe0-3546-416d-bbd7-dd95d0002802 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2070.497409] env[62277]: DEBUG oslo_vmware.api [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Waiting for the task: (returnval){ [ 2070.497409] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52d393c1-d658-6369-bc29-8be2ec22dce8" [ 2070.497409] env[62277]: _type = "Task" [ 2070.497409] env[62277]: } to complete. 
{{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2070.505545] env[62277]: DEBUG oslo_vmware.api [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52d393c1-d658-6369-bc29-8be2ec22dce8, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2070.569214] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Unregistered the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2070.569448] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Deleting contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2070.569592] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Deleting the datastore file [datastore2] a7cc7e45-8567-4699-af83-624b1c7c5c64 {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2070.569880] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-3438ebc8-9d1f-4e8a-9d59-d13687df1f7d {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2070.575903] env[62277]: DEBUG oslo_vmware.api [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Waiting for the task: (returnval){ [ 2070.575903] env[62277]: value = "task-1405485" [ 2070.575903] env[62277]: _type = "Task" [ 2070.575903] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2070.583133] env[62277]: DEBUG oslo_vmware.api [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Task: {'id': task-1405485, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2071.008448] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Preparing fetch location {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2071.008713] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Creating directory with path [datastore2] vmware_temp/9965fd45-cc0a-4b4d-8bc4-7aab2ef65335/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2071.008938] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c209aded-264c-4825-80cd-74c7d0c13243 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2071.020379] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Created directory with path [datastore2] vmware_temp/9965fd45-cc0a-4b4d-8bc4-7aab2ef65335/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2071.020572] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Fetch image to [datastore2] vmware_temp/9965fd45-cc0a-4b4d-8bc4-7aab2ef65335/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2071.020739] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to [datastore2] vmware_temp/9965fd45-cc0a-4b4d-8bc4-7aab2ef65335/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2071.021507] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d5873dd-8fab-467b-a088-3f7ed903119f {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2071.028053] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-063244d6-f09c-416a-857f-9da5e008b600 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2071.037230] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a42a8e07-8116-44ae-9991-a286f91559bd {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2071.068892] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-181b9537-582d-4ca5-b521-8f492a54b89e {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2071.074929] env[62277]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-1349328e-c5a7-40e3-906c-56953cc82771 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2071.084305] env[62277]: DEBUG oslo_vmware.api [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Task: {'id': task-1405485, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.067496} completed successfully. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2071.084538] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Deleted the datastore file {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2071.084725] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Deleted contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2071.084920] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2071.085107] env[62277]: INFO nova.compute.manager [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Took 0.60 seconds to destroy the instance on the hypervisor. 
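
The passage above shows the spawn failing inside the VMware image-cache path: the image download ends with RemoteDisconnected, and the follow-up CopyVirtualDisk_Task is rejected by vCenter with "A specified parameter was not correct: fileType", which oslo.vmware turns into a VimFaultException out of its task-poll loop before the instance is torn down again. As a rough sketch of that call pattern only (this is not Nova's actual helper; the function name, parameters, and datastore paths are illustrative), a copy task driven through an oslo.vmware session and awaited looks roughly like this:

    from oslo_vmware import exceptions as vexc


    def copy_disk(session, dc_ref, src_path, dst_path):
        """Start a CopyVirtualDisk_Task and wait for it (illustrative sketch).

        ``session`` is an oslo_vmware.api.VMwareAPISession, ``dc_ref`` the
        Datacenter managed-object reference for the datastore.
        """
        disk_mgr = session.vim.service_content.virtualDiskManager
        task = session.invoke_api(
            session.vim, 'CopyVirtualDisk_Task', disk_mgr,
            sourceName=src_path, sourceDatacenter=dc_ref,
            destName=dst_path, destDatacenter=dc_ref)
        try:
            # wait_for_task() polls the task (the "progress is 0%" lines above)
            # and raises once vCenter marks the task as errored.
            session.wait_for_task(task)
        except vexc.VimFaultException as exc:
            # exc.fault_list carries the raw fault names, e.g. ['InvalidArgument'].
            print('copy failed:', exc.fault_list, exc)
            raise

When the task errors, the exception propagates up through the spawn path, which is why the same traceback appears again below when _build_and_run_instance gives up and re-schedules the instance.
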
[ 2071.087186] env[62277]: DEBUG nova.compute.claims [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Aborting claim: {{(pid=62277) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 2071.087354] env[62277]: DEBUG oslo_concurrency.lockutils [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2071.087566] env[62277]: DEBUG oslo_concurrency.lockutils [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2071.098609] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2071.153290] env[62277]: DEBUG oslo_vmware.rw_handles [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/9965fd45-cc0a-4b4d-8bc4-7aab2ef65335/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62277) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2071.212641] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2071.215539] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2071.217794] env[62277]: DEBUG oslo_vmware.rw_handles [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Completed reading data from the image iterator. 
{{(pid=62277) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2071.217964] env[62277]: DEBUG oslo_vmware.rw_handles [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/9965fd45-cc0a-4b4d-8bc4-7aab2ef65335/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62277) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2071.352050] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-74f313ef-705b-4334-a675-27c9e5b08b0d {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2071.358775] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3402fdc0-9853-4a0c-859a-476e85db5e49 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2071.387088] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f931325d-03e0-4685-9d7c-e2348d0ac3c5 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2071.393842] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e27bca1-c4fe-472e-9452-229475240417 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2071.407155] env[62277]: DEBUG nova.compute.provider_tree [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2071.415580] env[62277]: DEBUG nova.scheduler.client.report [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2071.431639] env[62277]: DEBUG oslo_concurrency.lockutils [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.344s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2071.432138] env[62277]: ERROR nova.compute.manager [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] 
[instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2071.432138] env[62277]: Faults: ['InvalidArgument'] [ 2071.432138] env[62277]: ERROR nova.compute.manager [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Traceback (most recent call last): [ 2071.432138] env[62277]: ERROR nova.compute.manager [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2071.432138] env[62277]: ERROR nova.compute.manager [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] self.driver.spawn(context, instance, image_meta, [ 2071.432138] env[62277]: ERROR nova.compute.manager [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2071.432138] env[62277]: ERROR nova.compute.manager [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2071.432138] env[62277]: ERROR nova.compute.manager [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2071.432138] env[62277]: ERROR nova.compute.manager [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] self._fetch_image_if_missing(context, vi) [ 2071.432138] env[62277]: ERROR nova.compute.manager [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2071.432138] env[62277]: ERROR nova.compute.manager [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] image_cache(vi, tmp_image_ds_loc) [ 2071.432138] env[62277]: ERROR nova.compute.manager [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2071.432138] env[62277]: ERROR nova.compute.manager [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] vm_util.copy_virtual_disk( [ 2071.432138] env[62277]: ERROR nova.compute.manager [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2071.432138] env[62277]: ERROR nova.compute.manager [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] session._wait_for_task(vmdk_copy_task) [ 2071.432138] env[62277]: ERROR nova.compute.manager [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2071.432138] env[62277]: ERROR nova.compute.manager [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] return self.wait_for_task(task_ref) [ 2071.432138] env[62277]: ERROR nova.compute.manager [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2071.432138] env[62277]: ERROR nova.compute.manager [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] return evt.wait() [ 2071.432138] env[62277]: ERROR nova.compute.manager [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2071.432138] env[62277]: ERROR nova.compute.manager [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] result = hub.switch() [ 2071.432138] env[62277]: ERROR nova.compute.manager [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2071.432138] 
env[62277]: ERROR nova.compute.manager [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] return self.greenlet.switch() [ 2071.432138] env[62277]: ERROR nova.compute.manager [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2071.432138] env[62277]: ERROR nova.compute.manager [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] self.f(*self.args, **self.kw) [ 2071.432138] env[62277]: ERROR nova.compute.manager [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2071.432138] env[62277]: ERROR nova.compute.manager [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] raise exceptions.translate_fault(task_info.error) [ 2071.432138] env[62277]: ERROR nova.compute.manager [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2071.432138] env[62277]: ERROR nova.compute.manager [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Faults: ['InvalidArgument'] [ 2071.432138] env[62277]: ERROR nova.compute.manager [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] [ 2071.433237] env[62277]: DEBUG nova.compute.utils [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] VimFaultException {{(pid=62277) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2071.434207] env[62277]: DEBUG nova.compute.manager [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Build of instance a7cc7e45-8567-4699-af83-624b1c7c5c64 was re-scheduled: A specified parameter was not correct: fileType [ 2071.434207] env[62277]: Faults: ['InvalidArgument'] {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2071.434558] env[62277]: DEBUG nova.compute.manager [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Unplugging VIFs for instance {{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2071.434724] env[62277]: DEBUG nova.compute.manager [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2071.434887] env[62277]: DEBUG nova.compute.manager [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2071.435064] env[62277]: DEBUG nova.network.neutron [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2071.895261] env[62277]: DEBUG nova.network.neutron [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2071.905449] env[62277]: INFO nova.compute.manager [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Took 0.47 seconds to deallocate network for instance. [ 2071.997570] env[62277]: INFO nova.scheduler.client.report [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Deleted allocations for instance a7cc7e45-8567-4699-af83-624b1c7c5c64 [ 2072.016831] env[62277]: DEBUG oslo_concurrency.lockutils [None req-87e2a13a-490a-4e4c-8f9f-6700caaff99b tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Lock "a7cc7e45-8567-4699-af83-624b1c7c5c64" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 629.082s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2072.018240] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e37a08e3-976c-45f1-b47f-15fef7ecad2e tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Lock "a7cc7e45-8567-4699-af83-624b1c7c5c64" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 433.274s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2072.018464] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e37a08e3-976c-45f1-b47f-15fef7ecad2e tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Acquiring lock "a7cc7e45-8567-4699-af83-624b1c7c5c64-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2072.018668] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e37a08e3-976c-45f1-b47f-15fef7ecad2e tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Lock 
"a7cc7e45-8567-4699-af83-624b1c7c5c64-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2072.018828] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e37a08e3-976c-45f1-b47f-15fef7ecad2e tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Lock "a7cc7e45-8567-4699-af83-624b1c7c5c64-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2072.020638] env[62277]: INFO nova.compute.manager [None req-e37a08e3-976c-45f1-b47f-15fef7ecad2e tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Terminating instance [ 2072.022624] env[62277]: DEBUG nova.compute.manager [None req-e37a08e3-976c-45f1-b47f-15fef7ecad2e tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Start destroying the instance on the hypervisor. {{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2072.022959] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-e37a08e3-976c-45f1-b47f-15fef7ecad2e tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2072.023449] env[62277]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-7b43970c-4180-4179-8a55-3db91d161ea8 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2072.029177] env[62277]: DEBUG nova.compute.manager [None req-79b9b1c6-a222-4945-bee1-3943b02a2efc tempest-AttachVolumeNegativeTest-184472694 tempest-AttachVolumeNegativeTest-184472694-project-member] [instance: e0c4112c-6bc6-44e4-8e43-fda8203bf1c3] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 2072.035718] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1f009107-3694-4a6c-a891-ea557817eef6 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2072.052718] env[62277]: DEBUG nova.compute.manager [None req-79b9b1c6-a222-4945-bee1-3943b02a2efc tempest-AttachVolumeNegativeTest-184472694 tempest-AttachVolumeNegativeTest-184472694-project-member] [instance: e0c4112c-6bc6-44e4-8e43-fda8203bf1c3] Instance disappeared before build. {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 2072.066100] env[62277]: WARNING nova.virt.vmwareapi.vmops [None req-e37a08e3-976c-45f1-b47f-15fef7ecad2e tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance a7cc7e45-8567-4699-af83-624b1c7c5c64 could not be found. 
[ 2072.066360] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-e37a08e3-976c-45f1-b47f-15fef7ecad2e tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2072.066501] env[62277]: INFO nova.compute.manager [None req-e37a08e3-976c-45f1-b47f-15fef7ecad2e tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2072.066721] env[62277]: DEBUG oslo.service.loopingcall [None req-e37a08e3-976c-45f1-b47f-15fef7ecad2e tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2072.066943] env[62277]: DEBUG nova.compute.manager [-] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2072.067043] env[62277]: DEBUG nova.network.neutron [-] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2072.086770] env[62277]: DEBUG oslo_concurrency.lockutils [None req-79b9b1c6-a222-4945-bee1-3943b02a2efc tempest-AttachVolumeNegativeTest-184472694 tempest-AttachVolumeNegativeTest-184472694-project-member] Lock "e0c4112c-6bc6-44e4-8e43-fda8203bf1c3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 216.398s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2072.091531] env[62277]: DEBUG nova.network.neutron [-] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2072.095926] env[62277]: DEBUG nova.compute.manager [None req-0afa8b9c-dcfc-44a8-a6b0-bd433c5ab7c1 tempest-ServersNegativeTestJSON-1317204946 tempest-ServersNegativeTestJSON-1317204946-project-member] [instance: 81c79e22-aaa3-45dc-967a-b4a884f692eb] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 2072.100047] env[62277]: INFO nova.compute.manager [-] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] Took 0.03 seconds to deallocate network for instance. [ 2072.124260] env[62277]: DEBUG nova.compute.manager [None req-0afa8b9c-dcfc-44a8-a6b0-bd433c5ab7c1 tempest-ServersNegativeTestJSON-1317204946 tempest-ServersNegativeTestJSON-1317204946-project-member] [instance: 81c79e22-aaa3-45dc-967a-b4a884f692eb] Instance disappeared before build. 
{{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 2072.146226] env[62277]: DEBUG oslo_concurrency.lockutils [None req-0afa8b9c-dcfc-44a8-a6b0-bd433c5ab7c1 tempest-ServersNegativeTestJSON-1317204946 tempest-ServersNegativeTestJSON-1317204946-project-member] Lock "81c79e22-aaa3-45dc-967a-b4a884f692eb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 210.057s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2072.161899] env[62277]: DEBUG nova.compute.manager [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 2072.167464] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2072.167609] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2072.167752] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Starting heal instance info cache {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9909}} [ 2072.167874] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Rebuilding the list of instances to heal {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9913}} [ 2072.186553] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2072.186763] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2072.186926] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2072.187093] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2072.187250] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Skipping network cache update for instance because it is Building. 
{{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2072.187400] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2072.187548] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2072.187749] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2072.187910] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2072.188072] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Didn't find any instances for network info cache update. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9995}} [ 2072.188621] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager.update_available_resource {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2072.228696] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2072.229014] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2072.229116] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2072.229261] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62277) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2072.230396] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4783fe28-af8d-4088-b7e9-0ed8e54194c1 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2072.246051] 
env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9c95760d-b889-4931-a8a9-8e38d587b13d {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2072.252935] env[62277]: DEBUG oslo_concurrency.lockutils [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2072.253183] env[62277]: DEBUG oslo_concurrency.lockutils [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2072.254613] env[62277]: INFO nova.compute.claims [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2072.265807] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c5db5e0d-430d-4d83-b3fd-11f07b5b2868 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2072.270150] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e37a08e3-976c-45f1-b47f-15fef7ecad2e tempest-ServerRescueTestJSONUnderV235-996828968 tempest-ServerRescueTestJSONUnderV235-996828968-project-member] Lock "a7cc7e45-8567-4699-af83-624b1c7c5c64" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.252s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2072.270969] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "a7cc7e45-8567-4699-af83-624b1c7c5c64" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 202.144s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2072.271164] env[62277]: INFO nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: a7cc7e45-8567-4699-af83-624b1c7c5c64] During sync_power_state the instance has a pending task (deleting). Skip. 
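
The inventory dictionaries reported above are what the scheduler report client compares against placement before claims are made. As a sanity check on those numbers (a standalone sketch, not code from Nova or placement), schedulable capacity is commonly taken as (total - reserved) * allocation_ratio:

    # Illustrative arithmetic only, using the figures from the log above.
    inventory = {
        'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0},
    }

    for rc, inv in inventory.items():
        capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
        print('%s: schedulable capacity %g' % (rc, capacity))
    # -> VCPU 192, MEMORY_MB 196078, DISK_GB 400

With ten instances of 1 VCPU / 128 MB / 1 GB each currently allocated, the tracker's final resource view below (used_vcpus=10, used_ram=1792MB including the 512MB reserved, used_disk=10GB) stays well under those figures, which is why the claim for instance de543a46-26c3-40b3-9ccd-80bb1f9845d7 succeeds.
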
[ 2072.271360] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "a7cc7e45-8567-4699-af83-624b1c7c5c64" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2072.274124] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c4d87b40-99b7-4261-b6b1-f95918caf4bc {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2072.304235] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181422MB free_disk=184GB free_vcpus=48 pci_devices=None {{(pid=62277) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2072.304386] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2072.462378] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9de998e9-6dcf-4bfd-90a0-7ad76b7f96d9 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2072.469608] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c585e02c-7b47-4b2f-9f2b-79b9d9337dbc {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2072.499472] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2d246439-5a13-412d-b13f-306c58bcc7ec {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2072.506142] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-49d64e10-f4ea-4382-ac5e-a0a9539b6033 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2072.518587] env[62277]: DEBUG nova.compute.provider_tree [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2072.526455] env[62277]: DEBUG nova.scheduler.client.report [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) 
set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2072.539106] env[62277]: DEBUG oslo_concurrency.lockutils [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.286s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2072.539541] env[62277]: DEBUG nova.compute.manager [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Start building networks asynchronously for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 2072.541633] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.237s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2072.570819] env[62277]: DEBUG nova.compute.utils [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Using /dev/sd instead of None {{(pid=62277) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2072.572534] env[62277]: DEBUG nova.compute.manager [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Allocating IP information in the background. {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 2072.572868] env[62277]: DEBUG nova.network.neutron [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] allocate_for_instance() {{(pid=62277) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2072.585058] env[62277]: DEBUG nova.compute.manager [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Start building block device mappings for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 2072.611307] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 42005809-1926-44b2-8ef6-3b6cb28a4020 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2072.611570] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2072.611650] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2072.611785] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 163eb4e7-33f8-4674-8a3f-5094356e250d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2072.611902] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 400beb27-a709-4ef4-851e-5caaab9ca60b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2072.612037] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 9f7d4431-d5ea-4f9b-888b-77a6a7772047 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2072.612152] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 8b9ef530-e79f-4cd4-8a88-83871ed65f90 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2072.612394] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance c4c22c8a-4300-45ce-8484-77c638f7bbc5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2072.612394] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance ad83bf06-d712-4bd4-8086-9c3b615adaf5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2072.612518] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance de543a46-26c3-40b3-9ccd-80bb1f9845d7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2072.630746] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 297d53df-7918-4389-9c63-a600755da969 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 2072.639208] env[62277]: DEBUG nova.policy [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '013359a6ab0644799bb338125a970c37', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '47f21dc2b2ad4fe692324779a4a84760', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62277) authorize /opt/stack/nova/nova/policy.py:203}} [ 2072.641480] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 940561d5-723b-4e43-8fab-35e8af95ce09 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 2072.657951] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 6737c3b9-d9e6-4879-a6df-46d3c7dee40e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 2072.661959] env[62277]: DEBUG nova.compute.manager [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Start spawning the instance on the hypervisor. {{(pid=62277) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 2072.667852] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 40a309bf-6b7f-4360-a083-640db68bb00b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
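The resource tracker entries above sort placement allocations into two buckets: instances actively managed on this host (allocations kept) and instances scheduled here that have not started yet (heal skipped). A simplified sketch of that classification, with UUIDs taken from the log; the real _remove_deleted_instances_allocations logic covers more cases, such as removing allocations for instances no longer tracked.

```python
# Simplified sketch, not nova's ResourceTracker code.
def classify_allocations(allocations, managed, scheduled):
    """allocations: {instance_uuid: {'resources': {...}}}
    managed/scheduled: sets of instance UUIDs known to this compute host."""
    decisions = {}
    for uuid, alloc in allocations.items():
        if uuid in managed:
            decisions[uuid] = ("keep", "actively managed on this compute host")
        elif uuid in scheduled:
            decisions[uuid] = ("skip-heal", "scheduled here but has yet to start")
        else:
            decisions[uuid] = ("remove", "no longer tracked by this host")
    return decisions


if __name__ == "__main__":
    allocs = {
        "42005809-1926-44b2-8ef6-3b6cb28a4020": {"resources": {"DISK_GB": 1, "MEMORY_MB": 128, "VCPU": 1}},
        "297d53df-7918-4389-9c63-a600755da969": {"resources": {"DISK_GB": 1, "MEMORY_MB": 128, "VCPU": 1}},
    }
    managed = {"42005809-1926-44b2-8ef6-3b6cb28a4020"}
    scheduled = {"297d53df-7918-4389-9c63-a600755da969"}
    for uuid, (action, why) in classify_allocations(allocs, managed, scheduled).items():
        print(uuid, action, "-", why)
```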
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 2072.668187] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2072.668393] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2072.688878] env[62277]: DEBUG nova.virt.hardware [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-11-06T22:11:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-11-06T22:11:15Z,direct_url=,disk_format='vmdk',id=6f125163-af69-40e9-92ae-3b8a01d74b60,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='63ed6630e9c140baa826f53d7a0564d1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-11-06T22:11:16Z,virtual_size=,visibility=), allow threads: False {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2072.689120] env[62277]: DEBUG nova.virt.hardware [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Flavor limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2072.689275] env[62277]: DEBUG nova.virt.hardware [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Image limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2072.689456] env[62277]: DEBUG nova.virt.hardware [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Flavor pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2072.689601] env[62277]: DEBUG nova.virt.hardware [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Image pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2072.689743] env[62277]: DEBUG nova.virt.hardware [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2072.689940] env[62277]: DEBUG nova.virt.hardware [None 
req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2072.690113] env[62277]: DEBUG nova.virt.hardware [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2072.690281] env[62277]: DEBUG nova.virt.hardware [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Got 1 possible topologies {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2072.690467] env[62277]: DEBUG nova.virt.hardware [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2072.690644] env[62277]: DEBUG nova.virt.hardware [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2072.691499] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-81df3cbb-21ea-43f7-a522-4125269bb4db {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2072.700998] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bb9d6a6a-6832-451a-b887-41c6d1209cbd {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2072.839583] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-841ad27a-ad22-46d0-80e4-234b61cc4387 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2072.846721] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4efe55ee-a897-48c1-88d8-40ab6657d392 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2072.876758] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e3835174-e082-4c94-8f2a-4f2fe58c1bae {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2072.884016] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b02b33dc-1180-4bcd-a772-3f36005b18da {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2072.897207] env[62277]: DEBUG nova.compute.provider_tree [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed in ProviderTree for provider: 
75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2072.905042] env[62277]: DEBUG nova.scheduler.client.report [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2072.919970] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62277) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2072.920166] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.379s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2073.007840] env[62277]: DEBUG nova.network.neutron [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Successfully created port: 2413e493-0b90-41ea-8d37-f162f32e6c62 {{(pid=62277) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2073.745205] env[62277]: DEBUG nova.network.neutron [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Successfully updated port: 2413e493-0b90-41ea-8d37-f162f32e6c62 {{(pid=62277) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2073.754980] env[62277]: DEBUG oslo_concurrency.lockutils [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Acquiring lock "refresh_cache-de543a46-26c3-40b3-9ccd-80bb1f9845d7" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2073.755217] env[62277]: DEBUG oslo_concurrency.lockutils [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Acquired lock "refresh_cache-de543a46-26c3-40b3-9ccd-80bb1f9845d7" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2073.755417] env[62277]: DEBUG nova.network.neutron [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Building network info cache for instance {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2073.795337] env[62277]: DEBUG nova.network.neutron [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 
tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Instance cache missing network info. {{(pid=62277) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2074.078133] env[62277]: DEBUG nova.compute.manager [req-2602e696-4f60-4638-9e69-bf8989c4f74e req-832b5617-4b7f-4d92-9720-98798534b049 service nova] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Received event network-vif-plugged-2413e493-0b90-41ea-8d37-f162f32e6c62 {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 2074.078369] env[62277]: DEBUG oslo_concurrency.lockutils [req-2602e696-4f60-4638-9e69-bf8989c4f74e req-832b5617-4b7f-4d92-9720-98798534b049 service nova] Acquiring lock "de543a46-26c3-40b3-9ccd-80bb1f9845d7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2074.078569] env[62277]: DEBUG oslo_concurrency.lockutils [req-2602e696-4f60-4638-9e69-bf8989c4f74e req-832b5617-4b7f-4d92-9720-98798534b049 service nova] Lock "de543a46-26c3-40b3-9ccd-80bb1f9845d7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2074.078732] env[62277]: DEBUG oslo_concurrency.lockutils [req-2602e696-4f60-4638-9e69-bf8989c4f74e req-832b5617-4b7f-4d92-9720-98798534b049 service nova] Lock "de543a46-26c3-40b3-9ccd-80bb1f9845d7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2074.078892] env[62277]: DEBUG nova.compute.manager [req-2602e696-4f60-4638-9e69-bf8989c4f74e req-832b5617-4b7f-4d92-9720-98798534b049 service nova] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] No waiting events found dispatching network-vif-plugged-2413e493-0b90-41ea-8d37-f162f32e6c62 {{(pid=62277) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2074.079062] env[62277]: WARNING nova.compute.manager [req-2602e696-4f60-4638-9e69-bf8989c4f74e req-832b5617-4b7f-4d92-9720-98798534b049 service nova] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Received unexpected event network-vif-plugged-2413e493-0b90-41ea-8d37-f162f32e6c62 for instance with vm_state building and task_state spawning. [ 2074.079229] env[62277]: DEBUG nova.compute.manager [req-2602e696-4f60-4638-9e69-bf8989c4f74e req-832b5617-4b7f-4d92-9720-98798534b049 service nova] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Received event network-changed-2413e493-0b90-41ea-8d37-f162f32e6c62 {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 2074.079373] env[62277]: DEBUG nova.compute.manager [req-2602e696-4f60-4638-9e69-bf8989c4f74e req-832b5617-4b7f-4d92-9720-98798534b049 service nova] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Refreshing instance network info cache due to event network-changed-2413e493-0b90-41ea-8d37-f162f32e6c62. 
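The external-event entries above show a pop-or-warn pattern: if a waiter had registered for network-vif-plugged-&lt;port&gt; it would be woken, otherwise the event is reported as unexpected for an instance still building. A toy version of that registry (not nova's InstanceEvents class) to make the pattern concrete:

```python
import threading


class InstanceEvents:
    """Minimal illustration of the pop_instance_event pattern seen in the log."""

    def __init__(self):
        self._lock = threading.Lock()
        self._waiters = {}  # {(instance_uuid, event_name): threading.Event}

    def prepare(self, instance_uuid, event_name):
        """Register interest in an event before triggering the external action."""
        ev = threading.Event()
        with self._lock:
            self._waiters[(instance_uuid, event_name)] = ev
        return ev

    def pop_instance_event(self, instance_uuid, event_name):
        """Wake a registered waiter, or report that no one was waiting."""
        with self._lock:
            waiter = self._waiters.pop((instance_uuid, event_name), None)
        if waiter is None:
            print(f"No waiting events found dispatching {event_name}")
            return False
        waiter.set()
        return True


if __name__ == "__main__":
    events = InstanceEvents()
    uuid = "de543a46-26c3-40b3-9ccd-80bb1f9845d7"
    # Nothing registered for this event, so it is treated as unexpected.
    if not events.pop_instance_event(uuid, "network-vif-plugged-2413e493"):
        print(f"Received unexpected event for instance {uuid} (vm_state building)")
```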
{{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11104}} [ 2074.079535] env[62277]: DEBUG oslo_concurrency.lockutils [req-2602e696-4f60-4638-9e69-bf8989c4f74e req-832b5617-4b7f-4d92-9720-98798534b049 service nova] Acquiring lock "refresh_cache-de543a46-26c3-40b3-9ccd-80bb1f9845d7" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2074.186902] env[62277]: DEBUG nova.network.neutron [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Updating instance_info_cache with network_info: [{"id": "2413e493-0b90-41ea-8d37-f162f32e6c62", "address": "fa:16:3e:e2:08:4e", "network": {"id": "0fdd7696-a8f7-4fbe-88a6-4c5254049911", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-706519722-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "47f21dc2b2ad4fe692324779a4a84760", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7150f662-0cf1-44f9-ae14-d70f479649b6", "external-id": "nsx-vlan-transportzone-712", "segmentation_id": 712, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2413e493-0b", "ovs_interfaceid": "2413e493-0b90-41ea-8d37-f162f32e6c62", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2074.200954] env[62277]: DEBUG oslo_concurrency.lockutils [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Releasing lock "refresh_cache-de543a46-26c3-40b3-9ccd-80bb1f9845d7" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2074.201271] env[62277]: DEBUG nova.compute.manager [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Instance network_info: |[{"id": "2413e493-0b90-41ea-8d37-f162f32e6c62", "address": "fa:16:3e:e2:08:4e", "network": {"id": "0fdd7696-a8f7-4fbe-88a6-4c5254049911", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-706519722-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "47f21dc2b2ad4fe692324779a4a84760", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7150f662-0cf1-44f9-ae14-d70f479649b6", "external-id": "nsx-vlan-transportzone-712", "segmentation_id": 712, 
"bound_drivers": {"0": "nsxv3"}}, "devname": "tap2413e493-0b", "ovs_interfaceid": "2413e493-0b90-41ea-8d37-f162f32e6c62", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 2074.201678] env[62277]: DEBUG oslo_concurrency.lockutils [req-2602e696-4f60-4638-9e69-bf8989c4f74e req-832b5617-4b7f-4d92-9720-98798534b049 service nova] Acquired lock "refresh_cache-de543a46-26c3-40b3-9ccd-80bb1f9845d7" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2074.201841] env[62277]: DEBUG nova.network.neutron [req-2602e696-4f60-4638-9e69-bf8989c4f74e req-832b5617-4b7f-4d92-9720-98798534b049 service nova] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Refreshing network info cache for port 2413e493-0b90-41ea-8d37-f162f32e6c62 {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 2074.203571] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:e2:08:4e', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '7150f662-0cf1-44f9-ae14-d70f479649b6', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '2413e493-0b90-41ea-8d37-f162f32e6c62', 'vif_model': 'vmxnet3'}] {{(pid=62277) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2074.210218] env[62277]: DEBUG oslo.service.loopingcall [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2074.213291] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Creating VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2074.213731] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-2dcc8de6-262c-4dd5-97f2-b37e817776d2 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2074.234073] env[62277]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2074.234073] env[62277]: value = "task-1405486" [ 2074.234073] env[62277]: _type = "Task" [ 2074.234073] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2074.242366] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405486, 'name': CreateVM_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2074.545494] env[62277]: DEBUG nova.network.neutron [req-2602e696-4f60-4638-9e69-bf8989c4f74e req-832b5617-4b7f-4d92-9720-98798534b049 service nova] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Updated VIF entry in instance network info cache for port 2413e493-0b90-41ea-8d37-f162f32e6c62. 
{{(pid=62277) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 2074.545857] env[62277]: DEBUG nova.network.neutron [req-2602e696-4f60-4638-9e69-bf8989c4f74e req-832b5617-4b7f-4d92-9720-98798534b049 service nova] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Updating instance_info_cache with network_info: [{"id": "2413e493-0b90-41ea-8d37-f162f32e6c62", "address": "fa:16:3e:e2:08:4e", "network": {"id": "0fdd7696-a8f7-4fbe-88a6-4c5254049911", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-706519722-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "47f21dc2b2ad4fe692324779a4a84760", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7150f662-0cf1-44f9-ae14-d70f479649b6", "external-id": "nsx-vlan-transportzone-712", "segmentation_id": 712, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2413e493-0b", "ovs_interfaceid": "2413e493-0b90-41ea-8d37-f162f32e6c62", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2074.554611] env[62277]: DEBUG oslo_concurrency.lockutils [req-2602e696-4f60-4638-9e69-bf8989c4f74e req-832b5617-4b7f-4d92-9720-98798534b049 service nova] Releasing lock "refresh_cache-de543a46-26c3-40b3-9ccd-80bb1f9845d7" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2074.743372] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405486, 'name': CreateVM_Task, 'duration_secs': 0.290174} completed successfully. 
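The CreateVM_Task lines above follow the usual long-running-task protocol: submit the task, poll its progress, report the duration once it completes. A generic polling loop in that spirit; this is an illustration only, not oslo.vmware's wait_for_task, and the shape of the poll() result below is an assumption made for the example.

```python
import time


def wait_for_task(poll, interval=0.5, timeout=60.0):
    """Poll a long-running task until it finishes.

    `poll()` is assumed to return a dict with 'state' in
    {'running', 'success', 'error'} and an optional 'progress' percentage.
    """
    start = time.monotonic()
    while True:
        info = poll()
        if info["state"] == "success":
            return {"duration_secs": round(time.monotonic() - start, 6)}
        if info["state"] == "error":
            raise RuntimeError(info.get("error", "task failed"))
        print(f"Task progress is {info.get('progress', 0)}%.")
        if time.monotonic() - start > timeout:
            raise TimeoutError("task did not complete in time")
        time.sleep(interval)


if __name__ == "__main__":
    # Fake task: one 'running' poll, then success.
    states = iter([{"state": "running", "progress": 0}, {"state": "success"}])
    print(wait_for_task(lambda: next(states), interval=0.1))
```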
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2074.743535] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Created VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2074.744193] env[62277]: DEBUG oslo_concurrency.lockutils [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2074.744350] env[62277]: DEBUG oslo_concurrency.lockutils [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2074.744654] env[62277]: DEBUG oslo_concurrency.lockutils [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2074.744890] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6ff5c931-1a09-403a-a81d-d042d94824df {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2074.749074] env[62277]: DEBUG oslo_vmware.api [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Waiting for the task: (returnval){ [ 2074.749074] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]529d5535-6adf-e986-4c3b-8b3b8007309b" [ 2074.749074] env[62277]: _type = "Task" [ 2074.749074] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2074.756148] env[62277]: DEBUG oslo_vmware.api [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]529d5535-6adf-e986-4c3b-8b3b8007309b, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2074.900375] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2075.168976] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2075.259829] env[62277]: DEBUG oslo_concurrency.lockutils [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2075.260094] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Processing image 6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2075.260307] env[62277]: DEBUG oslo_concurrency.lockutils [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2077.163457] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2077.186336] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2077.186691] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] CONF.reclaim_instance_interval <= 0, skipping... 
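The periodic-task entries above show each task being announced and then optionally short-circuited by configuration ("CONF.reclaim_instance_interval <= 0, skipping..."). A toy periodic runner with that guard; the CONF dict and the scheduling below are stand-ins for oslo.config and oslo.service's periodic_task machinery, used only to illustrate the flow.

```python
import time

CONF = {"reclaim_instance_interval": 0}  # stand-in for oslo.config options


def reclaim_queued_deletes():
    if CONF["reclaim_instance_interval"] <= 0:
        print("CONF.reclaim_instance_interval <= 0, skipping...")
        return
    print("reclaiming soft-deleted instances...")


def run_periodic_tasks(tasks, period=1.0, rounds=2):
    """Announce and run each task every `period` seconds, `rounds` times."""
    for _ in range(rounds):
        for task in tasks:
            print(f"Running periodic task {task.__name__}")
            task()
        time.sleep(period)


if __name__ == "__main__":
    run_periodic_tasks([reclaim_queued_deletes], period=0.1)
```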
{{(pid=62277) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10528}} [ 2089.211902] env[62277]: DEBUG oslo_concurrency.lockutils [None req-02e3ae5e-70b8-4ba1-88c0-58a896429b9c tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Acquiring lock "de543a46-26c3-40b3-9ccd-80bb1f9845d7" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2117.279585] env[62277]: WARNING oslo_vmware.rw_handles [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2117.279585] env[62277]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2117.279585] env[62277]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2117.279585] env[62277]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2117.279585] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2117.279585] env[62277]: ERROR oslo_vmware.rw_handles response.begin() [ 2117.279585] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2117.279585] env[62277]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2117.279585] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2117.279585] env[62277]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2117.279585] env[62277]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2117.279585] env[62277]: ERROR oslo_vmware.rw_handles [ 2117.280326] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Downloaded image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to vmware_temp/9965fd45-cc0a-4b4d-8bc4-7aab2ef65335/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2117.282234] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Caching image {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2117.282521] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Copying Virtual Disk [datastore2] vmware_temp/9965fd45-cc0a-4b4d-8bc4-7aab2ef65335/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk to [datastore2] vmware_temp/9965fd45-cc0a-4b4d-8bc4-7aab2ef65335/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk {{(pid=62277) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2117.282844] env[62277]: 
DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-030945f1-eff4-42ec-9d44-a5a1e008a7d9 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2117.291321] env[62277]: DEBUG oslo_vmware.api [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Waiting for the task: (returnval){ [ 2117.291321] env[62277]: value = "task-1405487" [ 2117.291321] env[62277]: _type = "Task" [ 2117.291321] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2117.298994] env[62277]: DEBUG oslo_vmware.api [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Task: {'id': task-1405487, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2117.802092] env[62277]: DEBUG oslo_vmware.exceptions [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Fault InvalidArgument not matched. {{(pid=62277) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2117.802389] env[62277]: DEBUG oslo_concurrency.lockutils [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2117.802923] env[62277]: ERROR nova.compute.manager [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2117.802923] env[62277]: Faults: ['InvalidArgument'] [ 2117.802923] env[62277]: ERROR nova.compute.manager [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Traceback (most recent call last): [ 2117.802923] env[62277]: ERROR nova.compute.manager [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2117.802923] env[62277]: ERROR nova.compute.manager [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] yield resources [ 2117.802923] env[62277]: ERROR nova.compute.manager [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2117.802923] env[62277]: ERROR nova.compute.manager [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] self.driver.spawn(context, instance, image_meta, [ 2117.802923] env[62277]: ERROR nova.compute.manager [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2117.802923] env[62277]: ERROR nova.compute.manager [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2117.802923] env[62277]: ERROR nova.compute.manager [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2117.802923] env[62277]: ERROR nova.compute.manager [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] self._fetch_image_if_missing(context, vi) [ 2117.802923] env[62277]: ERROR nova.compute.manager [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2117.802923] env[62277]: ERROR nova.compute.manager [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] image_cache(vi, tmp_image_ds_loc) [ 2117.802923] env[62277]: ERROR nova.compute.manager [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2117.802923] env[62277]: ERROR nova.compute.manager [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] vm_util.copy_virtual_disk( [ 2117.802923] env[62277]: ERROR nova.compute.manager [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2117.802923] env[62277]: ERROR nova.compute.manager [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] session._wait_for_task(vmdk_copy_task) [ 2117.802923] env[62277]: ERROR nova.compute.manager [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2117.802923] env[62277]: ERROR nova.compute.manager [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] return self.wait_for_task(task_ref) [ 2117.802923] env[62277]: ERROR nova.compute.manager [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2117.802923] env[62277]: ERROR nova.compute.manager [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] return evt.wait() [ 2117.802923] env[62277]: ERROR nova.compute.manager [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2117.802923] env[62277]: ERROR nova.compute.manager [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] result = hub.switch() [ 2117.802923] env[62277]: ERROR nova.compute.manager [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2117.802923] env[62277]: ERROR nova.compute.manager [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] return self.greenlet.switch() [ 2117.802923] env[62277]: ERROR nova.compute.manager [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2117.802923] env[62277]: ERROR nova.compute.manager [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] self.f(*self.args, **self.kw) [ 2117.802923] env[62277]: ERROR nova.compute.manager [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2117.802923] env[62277]: ERROR nova.compute.manager [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] raise exceptions.translate_fault(task_info.error) [ 2117.802923] env[62277]: ERROR nova.compute.manager [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2117.802923] env[62277]: ERROR nova.compute.manager [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Faults: ['InvalidArgument'] [ 2117.802923] 
env[62277]: ERROR nova.compute.manager [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] [ 2117.804100] env[62277]: INFO nova.compute.manager [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Terminating instance [ 2117.804719] env[62277]: DEBUG oslo_concurrency.lockutils [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2117.804915] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2117.805165] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c0163969-39ba-406f-823c-2f374f091722 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2117.807235] env[62277]: DEBUG nova.compute.manager [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Start destroying the instance on the hypervisor. {{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2117.807622] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2117.808137] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-046aae77-e9f4-4293-a0a1-50ce92341a6d {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2117.814572] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Unregistering the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2117.814795] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-04747b35-f60e-4d21-8740-623ec20905cd {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2117.816833] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2117.817000] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-946b5031-b21a-4869-ad8c-0b6737378c00 
tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62277) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2117.817919] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-fd6478c9-93bb-4376-9096-b8fe2504f716 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2117.822279] env[62277]: DEBUG oslo_vmware.api [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Waiting for the task: (returnval){ [ 2117.822279] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52d611de-fd48-a629-b3c2-00931ea4a01e" [ 2117.822279] env[62277]: _type = "Task" [ 2117.822279] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2117.830423] env[62277]: DEBUG oslo_vmware.api [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52d611de-fd48-a629-b3c2-00931ea4a01e, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2117.888578] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Unregistered the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2117.888790] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Deleting contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2117.888991] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Deleting the datastore file [datastore2] 42005809-1926-44b2-8ef6-3b6cb28a4020 {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2117.889273] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-fe797ec5-76ec-4077-8745-66b65e2f0781 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2117.895363] env[62277]: DEBUG oslo_vmware.api [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Waiting for the task: (returnval){ [ 2117.895363] env[62277]: value = "task-1405489" [ 2117.895363] env[62277]: _type = "Task" [ 2117.895363] env[62277]: } to complete. 
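Paths such as "[datastore2] devstack-image-cache_base/&lt;image-id&gt;/&lt;image-id&gt;.vmdk" above are vSphere datastore paths of the form "[&lt;datastore&gt;] &lt;relative path&gt;". A small helper that only builds and joins such strings; it is not nova's ds_util or oslo.vmware's DatastorePath, just an illustration of the naming seen in the lock and directory messages.

```python
import posixpath


class DatastorePath:
    """Build '[<datastore>] <relative path>' strings."""

    def __init__(self, datastore, *parts):
        self.datastore = datastore
        self.rel_path = posixpath.join(*parts) if parts else ""

    def join(self, *parts):
        return DatastorePath(self.datastore, self.rel_path, *parts)

    def __str__(self):
        return f"[{self.datastore}] {self.rel_path}".rstrip()


if __name__ == "__main__":
    image_id = "6f125163-af69-40e9-92ae-3b8a01d74b60"
    cache = DatastorePath("datastore2", "devstack-image-cache_base")
    print(cache)                                      # [datastore2] devstack-image-cache_base
    print(cache.join(image_id, f"{image_id}.vmdk"))   # cached disk path as in the lock names above
```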
{{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2117.902735] env[62277]: DEBUG oslo_vmware.api [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Task: {'id': task-1405489, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2118.332473] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Preparing fetch location {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2118.332868] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Creating directory with path [datastore2] vmware_temp/69fec854-aa0f-4f21-aff3-964d618bfffc/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2118.332949] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-7d15a5df-2efd-4474-9cba-feec8fbafd75 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2118.343586] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Created directory with path [datastore2] vmware_temp/69fec854-aa0f-4f21-aff3-964d618bfffc/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2118.343775] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Fetch image to [datastore2] vmware_temp/69fec854-aa0f-4f21-aff3-964d618bfffc/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2118.343938] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to [datastore2] vmware_temp/69fec854-aa0f-4f21-aff3-964d618bfffc/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2118.344650] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6695c5b9-2ff9-4948-aa76-b30fc1a73789 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2118.350898] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33903c3f-0463-4669-b0a9-6836ddd6d8b5 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 
2118.359878] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ece8834-5fb9-4f55-ad59-dc84c7f6e092 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2118.389330] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d2e2d797-5122-4907-9f93-e7faf2cef2f5 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2118.394589] env[62277]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-7d30f7fb-13b7-4fa6-8b05-9c39d9c95399 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2118.403849] env[62277]: DEBUG oslo_vmware.api [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Task: {'id': task-1405489, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.06341} completed successfully. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2118.404086] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Deleted the datastore file {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2118.404268] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Deleted contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2118.404432] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2118.404608] env[62277]: INFO nova.compute.manager [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Took 0.60 seconds to destroy the instance on the hypervisor. 
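The sequence above (Processing image, Preparing fetch location, fetch to vmware_temp/.../tmp-sparse.vmdk, then caching the disk) is a check-cache-then-fetch flow. A toy local-filesystem version with a caller-supplied download() function; real nova operates on datastore paths and Glance images rather than local files, so treat this purely as a sketch of the control flow.

```python
import os
import shutil
import tempfile
import uuid


def fetch_image_if_missing(cache_dir, image_id, download):
    """Return the cached disk path, fetching into a temp location first if needed."""
    cached = os.path.join(cache_dir, image_id, f"{image_id}.vmdk")
    if os.path.exists(cached):
        return cached  # image already cached, nothing to do
    tmp_dir = os.path.join(tempfile.gettempdir(), "vmware_temp", str(uuid.uuid4()), image_id)
    os.makedirs(tmp_dir, exist_ok=True)
    tmp_disk = os.path.join(tmp_dir, "tmp-sparse.vmdk")
    download(tmp_disk)                       # fetch image file data to the temp location
    os.makedirs(os.path.dirname(cached), exist_ok=True)
    shutil.move(tmp_disk, cached)            # publish into the image cache
    return cached


if __name__ == "__main__":
    with tempfile.TemporaryDirectory() as cache:
        path = fetch_image_if_missing(
            cache, "6f125163-af69-40e9-92ae-3b8a01d74b60",
            download=lambda dest: open(dest, "wb").close())  # fake download
        print("cached at", path)
```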
[ 2118.406653] env[62277]: DEBUG nova.compute.claims [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Aborting claim: {{(pid=62277) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 2118.406821] env[62277]: DEBUG oslo_concurrency.lockutils [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2118.407043] env[62277]: DEBUG oslo_concurrency.lockutils [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2118.422983] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2118.473053] env[62277]: DEBUG oslo_vmware.rw_handles [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/69fec854-aa0f-4f21-aff3-964d618bfffc/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62277) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2118.532731] env[62277]: DEBUG oslo_vmware.rw_handles [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Completed reading data from the image iterator. {{(pid=62277) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2118.532916] env[62277]: DEBUG oslo_vmware.rw_handles [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/69fec854-aa0f-4f21-aff3-964d618bfffc/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
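The rw_handles entry above writes the image through the vSphere "/folder" HTTP endpoint; the URL carries the datastore-relative path plus dcPath and dsName query parameters. A sketch that rebuilds that URL shape from a "[datastore] path" string, for illustration only; the actual write handle lives in oslo.vmware's rw_handles module.

```python
from urllib.parse import quote, urlencode


def datastore_upload_url(host, ds_path, dc_path="ha-datacenter", port=443):
    """Rebuild https://<host>:<port>/folder/<path>?dcPath=...&dsName=...
    from a '[datastore] relative/path' string."""
    datastore, rel_path = ds_path[1:].split("] ", 1)
    query = urlencode({"dcPath": dc_path, "dsName": datastore})
    return f"https://{host}:{port}/folder/{quote(rel_path)}?{query}"


if __name__ == "__main__":
    path = ("[datastore2] vmware_temp/69fec854-aa0f-4f21-aff3-964d618bfffc/"
            "6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk")
    print(datastore_upload_url("esx7c1n3.openstack.eu-de-1.cloud.sap", path))
```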
{{(pid=62277) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2118.655305] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62f90661-c66a-45eb-95e1-82c84217b350 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2118.663143] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e74d4cb6-e553-40b8-8c8b-941f70cf00f5 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2118.693095] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-52f3143d-3eb2-4e1f-819f-356d3fc8e0f3 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2118.700014] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f3642715-7277-423e-b24a-217ae2ef5ecd {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2118.712898] env[62277]: DEBUG nova.compute.provider_tree [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2118.721449] env[62277]: DEBUG nova.scheduler.client.report [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2118.737072] env[62277]: DEBUG oslo_concurrency.lockutils [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.330s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2118.737599] env[62277]: ERROR nova.compute.manager [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2118.737599] env[62277]: Faults: ['InvalidArgument'] [ 2118.737599] env[62277]: ERROR nova.compute.manager [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Traceback (most recent call last): [ 2118.737599] env[62277]: ERROR nova.compute.manager [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2118.737599] 
env[62277]: ERROR nova.compute.manager [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] self.driver.spawn(context, instance, image_meta, [ 2118.737599] env[62277]: ERROR nova.compute.manager [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2118.737599] env[62277]: ERROR nova.compute.manager [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2118.737599] env[62277]: ERROR nova.compute.manager [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2118.737599] env[62277]: ERROR nova.compute.manager [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] self._fetch_image_if_missing(context, vi) [ 2118.737599] env[62277]: ERROR nova.compute.manager [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2118.737599] env[62277]: ERROR nova.compute.manager [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] image_cache(vi, tmp_image_ds_loc) [ 2118.737599] env[62277]: ERROR nova.compute.manager [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2118.737599] env[62277]: ERROR nova.compute.manager [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] vm_util.copy_virtual_disk( [ 2118.737599] env[62277]: ERROR nova.compute.manager [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2118.737599] env[62277]: ERROR nova.compute.manager [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] session._wait_for_task(vmdk_copy_task) [ 2118.737599] env[62277]: ERROR nova.compute.manager [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2118.737599] env[62277]: ERROR nova.compute.manager [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] return self.wait_for_task(task_ref) [ 2118.737599] env[62277]: ERROR nova.compute.manager [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2118.737599] env[62277]: ERROR nova.compute.manager [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] return evt.wait() [ 2118.737599] env[62277]: ERROR nova.compute.manager [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2118.737599] env[62277]: ERROR nova.compute.manager [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] result = hub.switch() [ 2118.737599] env[62277]: ERROR nova.compute.manager [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2118.737599] env[62277]: ERROR nova.compute.manager [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] return self.greenlet.switch() [ 2118.737599] env[62277]: ERROR nova.compute.manager [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2118.737599] env[62277]: ERROR nova.compute.manager [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] self.f(*self.args, **self.kw) [ 2118.737599] env[62277]: ERROR nova.compute.manager [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2118.737599] env[62277]: ERROR nova.compute.manager [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] raise exceptions.translate_fault(task_info.error) [ 2118.737599] env[62277]: ERROR nova.compute.manager [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2118.737599] env[62277]: ERROR nova.compute.manager [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Faults: ['InvalidArgument'] [ 2118.737599] env[62277]: ERROR nova.compute.manager [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] [ 2118.738530] env[62277]: DEBUG nova.compute.utils [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] VimFaultException {{(pid=62277) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2118.741453] env[62277]: DEBUG nova.compute.manager [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Build of instance 42005809-1926-44b2-8ef6-3b6cb28a4020 was re-scheduled: A specified parameter was not correct: fileType [ 2118.741453] env[62277]: Faults: ['InvalidArgument'] {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2118.741830] env[62277]: DEBUG nova.compute.manager [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Unplugging VIFs for instance {{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2118.742033] env[62277]: DEBUG nova.compute.manager [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2118.742233] env[62277]: DEBUG nova.compute.manager [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2118.742397] env[62277]: DEBUG nova.network.neutron [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2119.164301] env[62277]: DEBUG nova.network.neutron [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2119.180736] env[62277]: INFO nova.compute.manager [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Took 0.44 seconds to deallocate network for instance. [ 2119.282038] env[62277]: INFO nova.scheduler.client.report [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Deleted allocations for instance 42005809-1926-44b2-8ef6-3b6cb28a4020 [ 2119.302065] env[62277]: DEBUG oslo_concurrency.lockutils [None req-5167fb65-dfb6-44e7-8cb6-d26bce9740fd tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Lock "42005809-1926-44b2-8ef6-3b6cb28a4020" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 620.143s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2119.302866] env[62277]: DEBUG oslo_concurrency.lockutils [None req-5a8d7a5c-20c5-4603-904d-25e24a06d75d tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Lock "42005809-1926-44b2-8ef6-3b6cb28a4020" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 424.134s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2119.303095] env[62277]: DEBUG oslo_concurrency.lockutils [None req-5a8d7a5c-20c5-4603-904d-25e24a06d75d tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Acquiring lock "42005809-1926-44b2-8ef6-3b6cb28a4020-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2119.303308] env[62277]: DEBUG oslo_concurrency.lockutils [None req-5a8d7a5c-20c5-4603-904d-25e24a06d75d tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Lock "42005809-1926-44b2-8ef6-3b6cb28a4020-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2119.303485] env[62277]: DEBUG oslo_concurrency.lockutils [None req-5a8d7a5c-20c5-4603-904d-25e24a06d75d tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Lock "42005809-1926-44b2-8ef6-3b6cb28a4020-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2119.305388] env[62277]: INFO nova.compute.manager [None req-5a8d7a5c-20c5-4603-904d-25e24a06d75d tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Terminating instance [ 2119.307402] env[62277]: DEBUG nova.compute.manager [None req-5a8d7a5c-20c5-4603-904d-25e24a06d75d tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Start destroying the instance on the hypervisor. {{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2119.307717] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-5a8d7a5c-20c5-4603-904d-25e24a06d75d tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2119.308202] env[62277]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-46985659-c8a3-44a2-b49c-b675ecfefa22 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2119.313592] env[62277]: DEBUG nova.compute.manager [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 297d53df-7918-4389-9c63-a600755da969] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 2119.319986] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2d7b835e-f0cd-4357-918f-1d9271d27b67 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2119.348978] env[62277]: WARNING nova.virt.vmwareapi.vmops [None req-5a8d7a5c-20c5-4603-904d-25e24a06d75d tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 42005809-1926-44b2-8ef6-3b6cb28a4020 could not be found. 
[ 2119.349329] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-5a8d7a5c-20c5-4603-904d-25e24a06d75d tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2119.349329] env[62277]: INFO nova.compute.manager [None req-5a8d7a5c-20c5-4603-904d-25e24a06d75d tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2119.349546] env[62277]: DEBUG oslo.service.loopingcall [None req-5a8d7a5c-20c5-4603-904d-25e24a06d75d tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2119.351832] env[62277]: DEBUG nova.compute.manager [-] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2119.351941] env[62277]: DEBUG nova.network.neutron [-] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2119.366445] env[62277]: DEBUG oslo_concurrency.lockutils [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2119.366681] env[62277]: DEBUG oslo_concurrency.lockutils [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2119.368230] env[62277]: INFO nova.compute.claims [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 297d53df-7918-4389-9c63-a600755da969] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2119.379080] env[62277]: DEBUG nova.network.neutron [-] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2119.387581] env[62277]: INFO nova.compute.manager [-] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] Took 0.04 seconds to deallocate network for instance. 
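Just before and after this point the resource tracker reports the provider inventory (48 VCPU at allocation_ratio 4.0, 196590 MB of RAM with 512 MB reserved, 400 GB of disk) and then records a successful claim for instance 297d53df-7918-4389-9c63-a600755da969. Placement treats schedulable capacity per resource class as (total - reserved) * allocation_ratio; the short sketch below simply reproduces that arithmetic on the inventory dict printed in the log.

    # Sketch only: effective capacity from the logged inventory, using
    # placement's capacity formula (total - reserved) * allocation_ratio.
    inventory = {
        'VCPU': {'total': 48, 'reserved': 0, 'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB': {'total': 400, 'reserved': 0, 'allocation_ratio': 1.0},
    }

    for rc, inv in inventory.items():
        capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
        print(f"{rc}: schedulable capacity {capacity:g}")
    # VCPU: 192, MEMORY_MB: 196078, DISK_GB: 400

An m1.nano claim like the ones logged here ({'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}) fits comfortably inside that capacity, which is why the claim succeeds without waiting.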
[ 2119.479837] env[62277]: DEBUG oslo_concurrency.lockutils [None req-5a8d7a5c-20c5-4603-904d-25e24a06d75d tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Lock "42005809-1926-44b2-8ef6-3b6cb28a4020" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.177s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2119.481008] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "42005809-1926-44b2-8ef6-3b6cb28a4020" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 249.354s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2119.481008] env[62277]: INFO nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 42005809-1926-44b2-8ef6-3b6cb28a4020] During sync_power_state the instance has a pending task (deleting). Skip. [ 2119.481192] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "42005809-1926-44b2-8ef6-3b6cb28a4020" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2119.583815] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-87872536-cf3b-4a58-88b4-09532f30777e {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2119.592611] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-acd98c02-fe20-4080-b27a-5eb403b1189f {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2119.622865] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-133f5a75-67df-4924-8d28-56197d726078 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2119.630015] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-918bbee4-6e8c-451f-b245-18fa0c9db71c {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2119.642737] env[62277]: DEBUG nova.compute.provider_tree [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2119.654899] env[62277]: DEBUG nova.scheduler.client.report [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 
'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2119.667674] env[62277]: DEBUG oslo_concurrency.lockutils [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.301s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2119.668127] env[62277]: DEBUG nova.compute.manager [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 297d53df-7918-4389-9c63-a600755da969] Start building networks asynchronously for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 2119.707846] env[62277]: DEBUG nova.compute.utils [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Using /dev/sd instead of None {{(pid=62277) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2119.709139] env[62277]: DEBUG nova.compute.manager [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 297d53df-7918-4389-9c63-a600755da969] Allocating IP information in the background. {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 2119.709337] env[62277]: DEBUG nova.network.neutron [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 297d53df-7918-4389-9c63-a600755da969] allocate_for_instance() {{(pid=62277) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2119.718684] env[62277]: DEBUG nova.compute.manager [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 297d53df-7918-4389-9c63-a600755da969] Start building block device mappings for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 2119.766326] env[62277]: DEBUG nova.policy [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3a834d1a58b94907bc6944154314dce9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '24482eabb41e4102a26c9e7576a49c33', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62277) authorize /opt/stack/nova/nova/policy.py:203}} [ 2119.779900] env[62277]: DEBUG nova.compute.manager [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 297d53df-7918-4389-9c63-a600755da969] Start spawning the instance on the hypervisor. 
{{(pid=62277) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 2119.806675] env[62277]: DEBUG nova.virt.hardware [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-11-06T22:11:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-11-06T22:11:15Z,direct_url=,disk_format='vmdk',id=6f125163-af69-40e9-92ae-3b8a01d74b60,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='63ed6630e9c140baa826f53d7a0564d1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-11-06T22:11:16Z,virtual_size=,visibility=), allow threads: False {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2119.806936] env[62277]: DEBUG nova.virt.hardware [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Flavor limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2119.807105] env[62277]: DEBUG nova.virt.hardware [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Image limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2119.807306] env[62277]: DEBUG nova.virt.hardware [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Flavor pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2119.807425] env[62277]: DEBUG nova.virt.hardware [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Image pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2119.807567] env[62277]: DEBUG nova.virt.hardware [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2119.807767] env[62277]: DEBUG nova.virt.hardware [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2119.807948] env[62277]: DEBUG nova.virt.hardware [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2119.808211] 
env[62277]: DEBUG nova.virt.hardware [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Got 1 possible topologies {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2119.808386] env[62277]: DEBUG nova.virt.hardware [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2119.808553] env[62277]: DEBUG nova.virt.hardware [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2119.809436] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b4ecb65-b516-49c9-a0eb-0bec1e1ebc63 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2119.818518] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-030d8739-215b-4925-ad66-66a350791862 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2120.170676] env[62277]: DEBUG nova.network.neutron [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 297d53df-7918-4389-9c63-a600755da969] Successfully created port: 369565c4-7873-4b9f-aeb5-09f68fdaa1bf {{(pid=62277) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2121.008898] env[62277]: DEBUG nova.network.neutron [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 297d53df-7918-4389-9c63-a600755da969] Successfully updated port: 369565c4-7873-4b9f-aeb5-09f68fdaa1bf {{(pid=62277) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2121.021633] env[62277]: DEBUG oslo_concurrency.lockutils [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Acquiring lock "refresh_cache-297d53df-7918-4389-9c63-a600755da969" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2121.021778] env[62277]: DEBUG oslo_concurrency.lockutils [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Acquired lock "refresh_cache-297d53df-7918-4389-9c63-a600755da969" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2121.022076] env[62277]: DEBUG nova.network.neutron [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 297d53df-7918-4389-9c63-a600755da969] Building network info cache for instance {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2121.065790] env[62277]: DEBUG 
nova.network.neutron [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 297d53df-7918-4389-9c63-a600755da969] Instance cache missing network info. {{(pid=62277) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2121.243729] env[62277]: DEBUG nova.compute.manager [req-587f2f22-cdba-4b79-9897-87a495b77dd5 req-1ff81a81-c7cc-4ab8-87d2-3993f570248e service nova] [instance: 297d53df-7918-4389-9c63-a600755da969] Received event network-vif-plugged-369565c4-7873-4b9f-aeb5-09f68fdaa1bf {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 2121.243945] env[62277]: DEBUG oslo_concurrency.lockutils [req-587f2f22-cdba-4b79-9897-87a495b77dd5 req-1ff81a81-c7cc-4ab8-87d2-3993f570248e service nova] Acquiring lock "297d53df-7918-4389-9c63-a600755da969-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2121.244157] env[62277]: DEBUG oslo_concurrency.lockutils [req-587f2f22-cdba-4b79-9897-87a495b77dd5 req-1ff81a81-c7cc-4ab8-87d2-3993f570248e service nova] Lock "297d53df-7918-4389-9c63-a600755da969-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2121.244318] env[62277]: DEBUG oslo_concurrency.lockutils [req-587f2f22-cdba-4b79-9897-87a495b77dd5 req-1ff81a81-c7cc-4ab8-87d2-3993f570248e service nova] Lock "297d53df-7918-4389-9c63-a600755da969-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2121.244478] env[62277]: DEBUG nova.compute.manager [req-587f2f22-cdba-4b79-9897-87a495b77dd5 req-1ff81a81-c7cc-4ab8-87d2-3993f570248e service nova] [instance: 297d53df-7918-4389-9c63-a600755da969] No waiting events found dispatching network-vif-plugged-369565c4-7873-4b9f-aeb5-09f68fdaa1bf {{(pid=62277) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2121.244637] env[62277]: WARNING nova.compute.manager [req-587f2f22-cdba-4b79-9897-87a495b77dd5 req-1ff81a81-c7cc-4ab8-87d2-3993f570248e service nova] [instance: 297d53df-7918-4389-9c63-a600755da969] Received unexpected event network-vif-plugged-369565c4-7873-4b9f-aeb5-09f68fdaa1bf for instance with vm_state building and task_state spawning. [ 2121.244792] env[62277]: DEBUG nova.compute.manager [req-587f2f22-cdba-4b79-9897-87a495b77dd5 req-1ff81a81-c7cc-4ab8-87d2-3993f570248e service nova] [instance: 297d53df-7918-4389-9c63-a600755da969] Received event network-changed-369565c4-7873-4b9f-aeb5-09f68fdaa1bf {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 2121.244939] env[62277]: DEBUG nova.compute.manager [req-587f2f22-cdba-4b79-9897-87a495b77dd5 req-1ff81a81-c7cc-4ab8-87d2-3993f570248e service nova] [instance: 297d53df-7918-4389-9c63-a600755da969] Refreshing instance network info cache due to event network-changed-369565c4-7873-4b9f-aeb5-09f68fdaa1bf. 
{{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11104}} [ 2121.245140] env[62277]: DEBUG oslo_concurrency.lockutils [req-587f2f22-cdba-4b79-9897-87a495b77dd5 req-1ff81a81-c7cc-4ab8-87d2-3993f570248e service nova] Acquiring lock "refresh_cache-297d53df-7918-4389-9c63-a600755da969" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2121.265830] env[62277]: DEBUG nova.network.neutron [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 297d53df-7918-4389-9c63-a600755da969] Updating instance_info_cache with network_info: [{"id": "369565c4-7873-4b9f-aeb5-09f68fdaa1bf", "address": "fa:16:3e:3f:75:48", "network": {"id": "83a53f5b-0798-4c93-9294-0cdb526dc3ca", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1943573639-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "24482eabb41e4102a26c9e7576a49c33", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f85835c8-5d0c-4b2f-97c4-6c4006580f79", "external-id": "nsx-vlan-transportzone-245", "segmentation_id": 245, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap369565c4-78", "ovs_interfaceid": "369565c4-7873-4b9f-aeb5-09f68fdaa1bf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2121.279334] env[62277]: DEBUG oslo_concurrency.lockutils [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Releasing lock "refresh_cache-297d53df-7918-4389-9c63-a600755da969" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2121.279625] env[62277]: DEBUG nova.compute.manager [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 297d53df-7918-4389-9c63-a600755da969] Instance network_info: |[{"id": "369565c4-7873-4b9f-aeb5-09f68fdaa1bf", "address": "fa:16:3e:3f:75:48", "network": {"id": "83a53f5b-0798-4c93-9294-0cdb526dc3ca", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1943573639-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "24482eabb41e4102a26c9e7576a49c33", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f85835c8-5d0c-4b2f-97c4-6c4006580f79", "external-id": "nsx-vlan-transportzone-245", 
"segmentation_id": 245, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap369565c4-78", "ovs_interfaceid": "369565c4-7873-4b9f-aeb5-09f68fdaa1bf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 2121.279922] env[62277]: DEBUG oslo_concurrency.lockutils [req-587f2f22-cdba-4b79-9897-87a495b77dd5 req-1ff81a81-c7cc-4ab8-87d2-3993f570248e service nova] Acquired lock "refresh_cache-297d53df-7918-4389-9c63-a600755da969" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2121.280107] env[62277]: DEBUG nova.network.neutron [req-587f2f22-cdba-4b79-9897-87a495b77dd5 req-1ff81a81-c7cc-4ab8-87d2-3993f570248e service nova] [instance: 297d53df-7918-4389-9c63-a600755da969] Refreshing network info cache for port 369565c4-7873-4b9f-aeb5-09f68fdaa1bf {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 2121.281194] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 297d53df-7918-4389-9c63-a600755da969] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:3f:75:48', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'f85835c8-5d0c-4b2f-97c4-6c4006580f79', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '369565c4-7873-4b9f-aeb5-09f68fdaa1bf', 'vif_model': 'vmxnet3'}] {{(pid=62277) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2121.288874] env[62277]: DEBUG oslo.service.loopingcall [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2121.289748] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 297d53df-7918-4389-9c63-a600755da969] Creating VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2121.292404] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-bdc49934-2894-49b5-b6af-df34a5444980 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2121.312751] env[62277]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2121.312751] env[62277]: value = "task-1405490" [ 2121.312751] env[62277]: _type = "Task" [ 2121.312751] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2121.320779] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405490, 'name': CreateVM_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2121.576556] env[62277]: DEBUG nova.network.neutron [req-587f2f22-cdba-4b79-9897-87a495b77dd5 req-1ff81a81-c7cc-4ab8-87d2-3993f570248e service nova] [instance: 297d53df-7918-4389-9c63-a600755da969] Updated VIF entry in instance network info cache for port 369565c4-7873-4b9f-aeb5-09f68fdaa1bf. 
{{(pid=62277) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 2121.576914] env[62277]: DEBUG nova.network.neutron [req-587f2f22-cdba-4b79-9897-87a495b77dd5 req-1ff81a81-c7cc-4ab8-87d2-3993f570248e service nova] [instance: 297d53df-7918-4389-9c63-a600755da969] Updating instance_info_cache with network_info: [{"id": "369565c4-7873-4b9f-aeb5-09f68fdaa1bf", "address": "fa:16:3e:3f:75:48", "network": {"id": "83a53f5b-0798-4c93-9294-0cdb526dc3ca", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1943573639-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "24482eabb41e4102a26c9e7576a49c33", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f85835c8-5d0c-4b2f-97c4-6c4006580f79", "external-id": "nsx-vlan-transportzone-245", "segmentation_id": 245, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap369565c4-78", "ovs_interfaceid": "369565c4-7873-4b9f-aeb5-09f68fdaa1bf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2121.586427] env[62277]: DEBUG oslo_concurrency.lockutils [req-587f2f22-cdba-4b79-9897-87a495b77dd5 req-1ff81a81-c7cc-4ab8-87d2-3993f570248e service nova] Releasing lock "refresh_cache-297d53df-7918-4389-9c63-a600755da969" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2121.823089] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405490, 'name': CreateVM_Task, 'duration_secs': 0.262243} completed successfully. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2121.823259] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 297d53df-7918-4389-9c63-a600755da969] Created VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2121.823961] env[62277]: DEBUG oslo_concurrency.lockutils [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2121.824134] env[62277]: DEBUG oslo_concurrency.lockutils [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2121.824443] env[62277]: DEBUG oslo_concurrency.lockutils [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2121.824682] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-249bca3a-4df3-4806-b46a-b71c37245162 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2121.829038] env[62277]: DEBUG oslo_vmware.api [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Waiting for the task: (returnval){ [ 2121.829038] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52a26657-cc02-9954-8397-54034e2bbb0c" [ 2121.829038] env[62277]: _type = "Task" [ 2121.829038] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2121.836209] env[62277]: DEBUG oslo_vmware.api [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52a26657-cc02-9954-8397-54034e2bbb0c, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2122.340693] env[62277]: DEBUG oslo_concurrency.lockutils [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2122.340963] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 297d53df-7918-4389-9c63-a600755da969] Processing image 6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2122.341183] env[62277]: DEBUG oslo_concurrency.lockutils [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2125.169652] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2125.170026] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Cleaning up deleted instances {{(pid=62277) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11196}} [ 2125.180810] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] There are 0 instances to clean {{(pid=62277) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11205}} [ 2128.168980] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2128.169281] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Cleaning up deleted instances with incomplete migration {{(pid=62277) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11234}} [ 2130.178616] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2132.168045] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2132.168045] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62277) run_periodic_tasks 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2133.164387] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2134.168352] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2134.168614] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Starting heal instance info cache {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9909}} [ 2134.168652] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Rebuilding the list of instances to heal {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9913}} [ 2134.192947] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2134.193122] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2134.193250] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2134.193377] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2134.193499] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2134.193618] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2134.193736] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Skipping network cache update for instance because it is Building. 
{{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2134.193851] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2134.193965] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2134.194189] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 297d53df-7918-4389-9c63-a600755da969] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2134.194329] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Didn't find any instances for network info cache update. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9995}} [ 2134.194848] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager.update_available_resource {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2134.204975] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2134.205194] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2134.205458] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2134.205495] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62277) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2134.206614] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09151a46-8af1-429f-af75-add18eea1c84 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2134.215507] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b0fd8689-17a6-4fad-ace8-1c36a4cd1c9b {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2134.229267] env[62277]: DEBUG oslo_vmware.service [-] 
Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c91e8394-fa45-48db-8c36-de9a3b659e97 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2134.235683] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-83a8eb18-58d0-4710-a808-99fe0be03627 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2134.264363] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181395MB free_disk=184GB free_vcpus=48 pci_devices=None {{(pid=62277) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2134.264532] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2134.264685] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2134.426631] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2134.426800] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2134.426927] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 163eb4e7-33f8-4674-8a3f-5094356e250d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2134.427062] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 400beb27-a709-4ef4-851e-5caaab9ca60b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2134.427183] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 9f7d4431-d5ea-4f9b-888b-77a6a7772047 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2134.427299] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 8b9ef530-e79f-4cd4-8a88-83871ed65f90 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2134.427413] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance c4c22c8a-4300-45ce-8484-77c638f7bbc5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2134.427524] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance ad83bf06-d712-4bd4-8086-9c3b615adaf5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2134.427634] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance de543a46-26c3-40b3-9ccd-80bb1f9845d7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2134.427747] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 297d53df-7918-4389-9c63-a600755da969 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2134.438787] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 940561d5-723b-4e43-8fab-35e8af95ce09 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 2134.450824] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 6737c3b9-d9e6-4879-a6df-46d3c7dee40e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 2134.462828] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 40a309bf-6b7f-4360-a083-640db68bb00b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 2134.462828] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2134.462828] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2134.476192] env[62277]: DEBUG nova.scheduler.client.report [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Refreshing inventories for resource provider 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 2134.490116] env[62277]: DEBUG nova.scheduler.client.report [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Updating ProviderTree inventory for provider 75e125ea-a599-4b65-b9cd-6ea881735292 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 2134.490302] env[62277]: DEBUG nova.compute.provider_tree [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Updating inventory in ProviderTree for provider 75e125ea-a599-4b65-b9cd-6ea881735292 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 2134.500502] env[62277]: DEBUG nova.scheduler.client.report [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Refreshing aggregate associations for resource provider 75e125ea-a599-4b65-b9cd-6ea881735292, aggregates: None {{(pid=62277) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 2134.517781] env[62277]: DEBUG nova.scheduler.client.report [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Refreshing trait associations for resource provider 75e125ea-a599-4b65-b9cd-6ea881735292, traits: COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_VMDK {{(pid=62277) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 2134.659076] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-75eecc7a-3959-4b1b-a25f-bc2a2555fca3 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2134.666533] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-f80ab373-e72e-43a8-911c-a9244e7328f8 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2134.695572] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b9521aad-9103-4795-ae64-a31cb02872a6 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2134.702451] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4fbf033-3262-4784-8332-3eaa1de30524 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2134.714899] env[62277]: DEBUG nova.compute.provider_tree [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2134.723064] env[62277]: DEBUG nova.scheduler.client.report [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2134.738025] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62277) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2134.738025] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.473s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2135.711972] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2135.712365] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2138.169648] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2138.169986] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=62277) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10528}} [ 2141.168865] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2164.993315] env[62277]: WARNING oslo_vmware.rw_handles [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2164.993315] env[62277]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2164.993315] env[62277]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2164.993315] env[62277]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2164.993315] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2164.993315] env[62277]: ERROR oslo_vmware.rw_handles response.begin() [ 2164.993315] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2164.993315] env[62277]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2164.993315] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2164.993315] env[62277]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2164.993315] env[62277]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2164.993315] env[62277]: ERROR oslo_vmware.rw_handles [ 2164.994062] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Downloaded image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to vmware_temp/69fec854-aa0f-4f21-aff3-964d618bfffc/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2164.995813] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Caching image {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2164.996079] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Copying Virtual Disk [datastore2] vmware_temp/69fec854-aa0f-4f21-aff3-964d618bfffc/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk to [datastore2] vmware_temp/69fec854-aa0f-4f21-aff3-964d618bfffc/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk {{(pid=62277) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2164.996368] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with 
opID=oslo.vmware-f944d9ff-8f7b-4e78-899f-f2744083a311 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2165.004352] env[62277]: DEBUG oslo_vmware.api [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Waiting for the task: (returnval){ [ 2165.004352] env[62277]: value = "task-1405491" [ 2165.004352] env[62277]: _type = "Task" [ 2165.004352] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2165.012013] env[62277]: DEBUG oslo_vmware.api [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Task: {'id': task-1405491, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2165.514238] env[62277]: DEBUG oslo_vmware.exceptions [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Fault InvalidArgument not matched. {{(pid=62277) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2165.514511] env[62277]: DEBUG oslo_concurrency.lockutils [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2165.515055] env[62277]: ERROR nova.compute.manager [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2165.515055] env[62277]: Faults: ['InvalidArgument'] [ 2165.515055] env[62277]: ERROR nova.compute.manager [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Traceback (most recent call last): [ 2165.515055] env[62277]: ERROR nova.compute.manager [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2165.515055] env[62277]: ERROR nova.compute.manager [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] yield resources [ 2165.515055] env[62277]: ERROR nova.compute.manager [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2165.515055] env[62277]: ERROR nova.compute.manager [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] self.driver.spawn(context, instance, image_meta, [ 2165.515055] env[62277]: ERROR nova.compute.manager [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2165.515055] env[62277]: ERROR nova.compute.manager [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2165.515055] env[62277]: ERROR nova.compute.manager [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 
786, in spawn [ 2165.515055] env[62277]: ERROR nova.compute.manager [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] self._fetch_image_if_missing(context, vi) [ 2165.515055] env[62277]: ERROR nova.compute.manager [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2165.515055] env[62277]: ERROR nova.compute.manager [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] image_cache(vi, tmp_image_ds_loc) [ 2165.515055] env[62277]: ERROR nova.compute.manager [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2165.515055] env[62277]: ERROR nova.compute.manager [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] vm_util.copy_virtual_disk( [ 2165.515055] env[62277]: ERROR nova.compute.manager [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2165.515055] env[62277]: ERROR nova.compute.manager [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] session._wait_for_task(vmdk_copy_task) [ 2165.515055] env[62277]: ERROR nova.compute.manager [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2165.515055] env[62277]: ERROR nova.compute.manager [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] return self.wait_for_task(task_ref) [ 2165.515055] env[62277]: ERROR nova.compute.manager [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2165.515055] env[62277]: ERROR nova.compute.manager [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] return evt.wait() [ 2165.515055] env[62277]: ERROR nova.compute.manager [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2165.515055] env[62277]: ERROR nova.compute.manager [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] result = hub.switch() [ 2165.515055] env[62277]: ERROR nova.compute.manager [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2165.515055] env[62277]: ERROR nova.compute.manager [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] return self.greenlet.switch() [ 2165.515055] env[62277]: ERROR nova.compute.manager [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2165.515055] env[62277]: ERROR nova.compute.manager [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] self.f(*self.args, **self.kw) [ 2165.515055] env[62277]: ERROR nova.compute.manager [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2165.515055] env[62277]: ERROR nova.compute.manager [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] raise exceptions.translate_fault(task_info.error) [ 2165.515055] env[62277]: ERROR nova.compute.manager [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2165.515055] env[62277]: ERROR nova.compute.manager [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Faults: ['InvalidArgument'] [ 2165.515055] env[62277]: ERROR nova.compute.manager [instance: 
b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] [ 2165.516378] env[62277]: INFO nova.compute.manager [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Terminating instance [ 2165.516818] env[62277]: DEBUG oslo_concurrency.lockutils [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2165.517029] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2165.517270] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-431e461f-e70b-4d71-a31e-5477738c1332 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2165.519546] env[62277]: DEBUG nova.compute.manager [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Start destroying the instance on the hypervisor. {{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2165.519758] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2165.520475] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-411fc229-0bf5-48a7-83a3-b93dbad027a4 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2165.527026] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Unregistering the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2165.527220] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-0199ad75-e243-4d2a-8965-86e3d2a025d0 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2165.529335] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2165.529511] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-98b94e25-4408-497a-b296-d6830f20b77a 
tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62277) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2165.530423] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d429feb6-0c93-4266-b756-05fd5dbf6d35 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2165.535068] env[62277]: DEBUG oslo_vmware.api [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Waiting for the task: (returnval){ [ 2165.535068] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]529f92a9-339e-51b1-bd89-f1691bb3c151" [ 2165.535068] env[62277]: _type = "Task" [ 2165.535068] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2165.542171] env[62277]: DEBUG oslo_vmware.api [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]529f92a9-339e-51b1-bd89-f1691bb3c151, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2165.591224] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Unregistered the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2165.591525] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Deleting contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2165.591833] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Deleting the datastore file [datastore2] b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2165.592199] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-9440ed1f-5819-4e37-9149-8d95feb878b7 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2165.598708] env[62277]: DEBUG oslo_vmware.api [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Waiting for the task: (returnval){ [ 2165.598708] env[62277]: value = "task-1405493" [ 2165.598708] env[62277]: _type = "Task" [ 2165.598708] env[62277]: } to complete. 
{{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2165.606073] env[62277]: DEBUG oslo_vmware.api [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Task: {'id': task-1405493, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2166.046218] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Preparing fetch location {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2166.046527] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Creating directory with path [datastore2] vmware_temp/eab6ee2d-f72a-449e-a310-b5f8309e9112/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2166.046728] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e2f70285-a060-49c5-9af2-308289d78d40 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2166.057829] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Created directory with path [datastore2] vmware_temp/eab6ee2d-f72a-449e-a310-b5f8309e9112/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2166.058018] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Fetch image to [datastore2] vmware_temp/eab6ee2d-f72a-449e-a310-b5f8309e9112/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2166.058197] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to [datastore2] vmware_temp/eab6ee2d-f72a-449e-a310-b5f8309e9112/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2166.058919] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a29d8feb-e64b-4411-a7aa-76b152129db7 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2166.065422] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-02bf44d3-c4ed-4dc7-8df2-10ac77b1b784 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} 
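The ds_util/vmops entries above show the fetch-location layout the driver logs while caching an image: a per-request scratch directory under vmware_temp/<random-uuid>/<image-id> on the datastore, with the raw download written to tmp-sparse.vmdk before the CopyVirtualDisk step produces the image-id-named copy. A minimal sketch of assembling those datastore path strings; the helper name and return shape are hypothetical, not the actual nova.virt.vmwareapi API:

import uuid

def build_fetch_paths(datastore: str, image_id: str) -> dict:
    # Per-request scratch directory, mirroring the "vmware_temp/<uuid>/<image-id>"
    # layout visible in the ds_util mkdir log lines above (illustrative only).
    scratch_dir = f"vmware_temp/{uuid.uuid4()}/{image_id}"
    return {
        "scratch_dir": f"[{datastore}] {scratch_dir}",
        # Raw sparse download target, as in the "Fetch image to ..." entries.
        "tmp_sparse": f"[{datastore}] {scratch_dir}/tmp-sparse.vmdk",
        # Destination of the CopyVirtualDisk step seen earlier in this log.
        "copied_vmdk": f"[{datastore}] {scratch_dir}/{image_id}.vmdk",
    }

paths = build_fetch_paths("datastore2", "6f125163-af69-40e9-92ae-3b8a01d74b60")
print(paths["tmp_sparse"])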
[ 2166.074292] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ad8eb408-2438-43e6-a070-f3158fcf212f {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2166.108171] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16f5f0f0-2f0e-4d22-b153-7f9bb93273ab {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2166.114968] env[62277]: DEBUG oslo_vmware.api [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Task: {'id': task-1405493, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.077406} completed successfully. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2166.116436] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Deleted the datastore file {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2166.116625] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Deleted contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2166.116792] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2166.116961] env[62277]: INFO nova.compute.manager [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Took 0.60 seconds to destroy the instance on the hypervisor. 
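The DeleteDatastoreFile_Task entries above ('progress is 0%' followed by 'completed successfully ... duration_secs: 0.077406') follow the same vCenter task pattern as the CopyVirtualDisk traceback earlier: submit the task, poll its state until it is terminal, and turn an error state into an exception. A rough stand-alone sketch of that polling loop, assuming a hypothetical get_task_info() callable rather than the real oslo.vmware session API:

import time

class TaskFailed(Exception):
    pass

def wait_for_task(get_task_info, interval=0.5, timeout=300.0):
    # get_task_info() is assumed to return a dict such as
    # {'state': 'running'|'success'|'error', 'progress': 0, 'error': '...'}.
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        info = get_task_info()
        if info["state"] == "success":
            return info
        if info["state"] == "error":
            # Comparable to the "raise exceptions.translate_fault(task_info.error)"
            # step in the traceback logged for task-1405491.
            raise TaskFailed(info.get("error", "unknown fault"))
        time.sleep(interval)  # stands in for the periodic _poll_task calls
    raise TimeoutError("task did not complete in time")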
[ 2166.118680] env[62277]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-ac5c4673-0233-4ff8-83a7-5b8f6e797f21 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2166.120566] env[62277]: DEBUG nova.compute.claims [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Aborting claim: {{(pid=62277) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 2166.120738] env[62277]: DEBUG oslo_concurrency.lockutils [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2166.120946] env[62277]: DEBUG oslo_concurrency.lockutils [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2166.142009] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2166.289440] env[62277]: DEBUG oslo_vmware.rw_handles [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/eab6ee2d-f72a-449e-a310-b5f8309e9112/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62277) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2166.348945] env[62277]: DEBUG oslo_vmware.rw_handles [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Completed reading data from the image iterator. {{(pid=62277) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2166.349154] env[62277]: DEBUG oslo_vmware.rw_handles [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/eab6ee2d-f72a-449e-a310-b5f8309e9112/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=62277) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2166.369913] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a6b7ea8-a5c8-48a7-9b17-80b971aad568 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2166.377267] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-627a97be-c511-468d-ac3a-ba5686867c0c {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2166.406177] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2ae86cf0-6848-4914-8c7c-d5fdfc508c6e {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2166.413115] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6d504383-728f-4d98-89d0-18c28e379ef6 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2166.425729] env[62277]: DEBUG nova.compute.provider_tree [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2166.433741] env[62277]: DEBUG nova.scheduler.client.report [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2166.448152] env[62277]: DEBUG oslo_concurrency.lockutils [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.327s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2166.448682] env[62277]: ERROR nova.compute.manager [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2166.448682] env[62277]: Faults: ['InvalidArgument'] [ 2166.448682] env[62277]: ERROR nova.compute.manager [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Traceback (most recent call last): [ 2166.448682] env[62277]: ERROR nova.compute.manager [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in 
_build_and_run_instance [ 2166.448682] env[62277]: ERROR nova.compute.manager [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] self.driver.spawn(context, instance, image_meta, [ 2166.448682] env[62277]: ERROR nova.compute.manager [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2166.448682] env[62277]: ERROR nova.compute.manager [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2166.448682] env[62277]: ERROR nova.compute.manager [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2166.448682] env[62277]: ERROR nova.compute.manager [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] self._fetch_image_if_missing(context, vi) [ 2166.448682] env[62277]: ERROR nova.compute.manager [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2166.448682] env[62277]: ERROR nova.compute.manager [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] image_cache(vi, tmp_image_ds_loc) [ 2166.448682] env[62277]: ERROR nova.compute.manager [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2166.448682] env[62277]: ERROR nova.compute.manager [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] vm_util.copy_virtual_disk( [ 2166.448682] env[62277]: ERROR nova.compute.manager [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2166.448682] env[62277]: ERROR nova.compute.manager [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] session._wait_for_task(vmdk_copy_task) [ 2166.448682] env[62277]: ERROR nova.compute.manager [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2166.448682] env[62277]: ERROR nova.compute.manager [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] return self.wait_for_task(task_ref) [ 2166.448682] env[62277]: ERROR nova.compute.manager [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2166.448682] env[62277]: ERROR nova.compute.manager [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] return evt.wait() [ 2166.448682] env[62277]: ERROR nova.compute.manager [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2166.448682] env[62277]: ERROR nova.compute.manager [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] result = hub.switch() [ 2166.448682] env[62277]: ERROR nova.compute.manager [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2166.448682] env[62277]: ERROR nova.compute.manager [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] return self.greenlet.switch() [ 2166.448682] env[62277]: ERROR nova.compute.manager [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2166.448682] env[62277]: ERROR nova.compute.manager [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] self.f(*self.args, **self.kw) [ 2166.448682] env[62277]: ERROR nova.compute.manager [instance: 
b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2166.448682] env[62277]: ERROR nova.compute.manager [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] raise exceptions.translate_fault(task_info.error) [ 2166.448682] env[62277]: ERROR nova.compute.manager [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2166.448682] env[62277]: ERROR nova.compute.manager [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Faults: ['InvalidArgument'] [ 2166.448682] env[62277]: ERROR nova.compute.manager [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] [ 2166.449719] env[62277]: DEBUG nova.compute.utils [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] VimFaultException {{(pid=62277) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2166.451222] env[62277]: DEBUG nova.compute.manager [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Build of instance b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d was re-scheduled: A specified parameter was not correct: fileType [ 2166.451222] env[62277]: Faults: ['InvalidArgument'] {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2166.451621] env[62277]: DEBUG nova.compute.manager [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Unplugging VIFs for instance {{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2166.451793] env[62277]: DEBUG nova.compute.manager [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2166.451959] env[62277]: DEBUG nova.compute.manager [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2166.452132] env[62277]: DEBUG nova.network.neutron [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2166.987521] env[62277]: DEBUG nova.network.neutron [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2167.000402] env[62277]: INFO nova.compute.manager [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Took 0.55 seconds to deallocate network for instance. [ 2167.085659] env[62277]: INFO nova.scheduler.client.report [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Deleted allocations for instance b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d [ 2167.107578] env[62277]: DEBUG oslo_concurrency.lockutils [None req-946b5031-b21a-4869-ad8c-0b6737378c00 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Lock "b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 635.005s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2167.108739] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 296.982s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2167.108949] env[62277]: INFO nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] During sync_power_state the instance has a pending task (spawning). Skip. 
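The inventory reported repeatedly for provider 75e125ea-a599-4b65-b9cd-6ea881735292 (VCPU total 48 with allocation_ratio 4.0, MEMORY_MB total 196590 with 512 reserved, DISK_GB total 400 with max_unit 184) determines how much Placement will let the scheduler allocate: usable capacity is roughly (total - reserved) * allocation_ratio, while max_unit caps what a single instance may request. A small sketch of that arithmetic using the values from this log:

def capacity(inv: dict) -> int:
    # Placement-style usable capacity: (total - reserved) * allocation_ratio.
    return int((inv["total"] - inv["reserved"]) * inv["allocation_ratio"])

inventory = {
    "VCPU": {"total": 48, "reserved": 0, "allocation_ratio": 4.0, "max_unit": 16},
    "MEMORY_MB": {"total": 196590, "reserved": 512, "allocation_ratio": 1.0, "max_unit": 65530},
    "DISK_GB": {"total": 400, "reserved": 0, "allocation_ratio": 1.0, "max_unit": 184},
}

for rc, inv in inventory.items():
    # Prints: VCPU 192, MEMORY_MB 196078, DISK_GB 400
    print(rc, capacity(inv))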
[ 2167.109145] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2167.109706] env[62277]: DEBUG oslo_concurrency.lockutils [None req-d5c0f938-9ce0-421d-896b-33de135b567b tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Lock "b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 241.310s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2167.109825] env[62277]: DEBUG oslo_concurrency.lockutils [None req-d5c0f938-9ce0-421d-896b-33de135b567b tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Acquiring lock "b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2167.110366] env[62277]: DEBUG oslo_concurrency.lockutils [None req-d5c0f938-9ce0-421d-896b-33de135b567b tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Lock "b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2167.111981] env[62277]: DEBUG oslo_concurrency.lockutils [None req-d5c0f938-9ce0-421d-896b-33de135b567b tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Lock "b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.001s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2167.112519] env[62277]: INFO nova.compute.manager [None req-d5c0f938-9ce0-421d-896b-33de135b567b tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Terminating instance [ 2167.114310] env[62277]: DEBUG nova.compute.manager [None req-d5c0f938-9ce0-421d-896b-33de135b567b tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Start destroying the instance on the hypervisor. 
{{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2167.114437] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-d5c0f938-9ce0-421d-896b-33de135b567b tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2167.115106] env[62277]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-979c4a4d-a39f-49c2-b662-a5d275e126e5 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2167.119658] env[62277]: DEBUG nova.compute.manager [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 2167.126528] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-49232b14-ce10-4dc4-9843-3b8f391db577 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2167.155510] env[62277]: WARNING nova.virt.vmwareapi.vmops [None req-d5c0f938-9ce0-421d-896b-33de135b567b tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d could not be found. [ 2167.155668] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-d5c0f938-9ce0-421d-896b-33de135b567b tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2167.156750] env[62277]: INFO nova.compute.manager [None req-d5c0f938-9ce0-421d-896b-33de135b567b tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2167.156750] env[62277]: DEBUG oslo.service.loopingcall [None req-d5c0f938-9ce0-421d-896b-33de135b567b tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2167.160400] env[62277]: DEBUG nova.compute.manager [-] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2167.160587] env[62277]: DEBUG nova.network.neutron [-] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2167.172111] env[62277]: DEBUG oslo_concurrency.lockutils [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2167.172365] env[62277]: DEBUG oslo_concurrency.lockutils [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2167.173838] env[62277]: INFO nova.compute.claims [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2167.189993] env[62277]: DEBUG nova.network.neutron [-] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2167.207355] env[62277]: INFO nova.compute.manager [-] [instance: b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d] Took 0.05 seconds to deallocate network for instance. 
[ 2167.291916] env[62277]: DEBUG oslo_concurrency.lockutils [None req-d5c0f938-9ce0-421d-896b-33de135b567b tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Lock "b6dddee5-328c-4a51-9ce2-fbfd5c50aa9d" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.182s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2167.353862] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b2b519f-8d0e-4616-b820-da40edc072d7 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2167.361234] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-70d4d0bf-b564-4554-8a73-8771367ee050 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2167.392382] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-198ffad7-c63b-496e-b8f6-13c39f1aaf67 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2167.399251] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e18603da-9ded-4202-892e-afc37a7115c1 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2167.411990] env[62277]: DEBUG nova.compute.provider_tree [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2167.420546] env[62277]: DEBUG nova.scheduler.client.report [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2167.433686] env[62277]: DEBUG oslo_concurrency.lockutils [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.261s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2167.434129] env[62277]: DEBUG nova.compute.manager [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Start building networks asynchronously for instance. 
{{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 2167.464925] env[62277]: DEBUG nova.compute.utils [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Using /dev/sd instead of None {{(pid=62277) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2167.465054] env[62277]: DEBUG nova.compute.manager [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Allocating IP information in the background. {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 2167.465196] env[62277]: DEBUG nova.network.neutron [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] allocate_for_instance() {{(pid=62277) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2167.474047] env[62277]: DEBUG nova.compute.manager [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Start building block device mappings for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 2167.520594] env[62277]: DEBUG nova.policy [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '92ac95b3fd0e4fff9b84b0c796c93c56', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ae9d3e1015a54d54873109ff0650210f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62277) authorize /opt/stack/nova/nova/policy.py:203}} [ 2167.535250] env[62277]: DEBUG nova.compute.manager [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Start spawning the instance on the hypervisor. 
{{(pid=62277) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 2167.561911] env[62277]: DEBUG nova.virt.hardware [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-11-06T22:11:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-11-06T22:11:15Z,direct_url=,disk_format='vmdk',id=6f125163-af69-40e9-92ae-3b8a01d74b60,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='63ed6630e9c140baa826f53d7a0564d1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-11-06T22:11:16Z,virtual_size=,visibility=), allow threads: False {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2167.562188] env[62277]: DEBUG nova.virt.hardware [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Flavor limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2167.562344] env[62277]: DEBUG nova.virt.hardware [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Image limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2167.562520] env[62277]: DEBUG nova.virt.hardware [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Flavor pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2167.562977] env[62277]: DEBUG nova.virt.hardware [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Image pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2167.562977] env[62277]: DEBUG nova.virt.hardware [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2167.562977] env[62277]: DEBUG nova.virt.hardware [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2167.563202] env[62277]: DEBUG nova.virt.hardware [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2167.563311] env[62277]: DEBUG nova.virt.hardware [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Got 1 
possible topologies {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2167.563471] env[62277]: DEBUG nova.virt.hardware [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2167.563641] env[62277]: DEBUG nova.virt.hardware [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2167.564579] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ea594a5-8b6b-420b-a112-13781471667e {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2167.572938] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d93b47d1-62d7-402f-85bb-633750c5f62b {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2167.822238] env[62277]: DEBUG nova.network.neutron [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Successfully created port: 1acbadf4-4879-4f3c-b22d-13bd9a1a0e43 {{(pid=62277) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2168.433538] env[62277]: DEBUG nova.network.neutron [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Successfully updated port: 1acbadf4-4879-4f3c-b22d-13bd9a1a0e43 {{(pid=62277) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2168.444911] env[62277]: DEBUG oslo_concurrency.lockutils [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Acquiring lock "refresh_cache-940561d5-723b-4e43-8fab-35e8af95ce09" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2168.445283] env[62277]: DEBUG oslo_concurrency.lockutils [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Acquired lock "refresh_cache-940561d5-723b-4e43-8fab-35e8af95ce09" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2168.445283] env[62277]: DEBUG nova.network.neutron [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Building network info cache for instance {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2168.483687] env[62277]: DEBUG nova.network.neutron [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Instance cache missing network info. 
{{(pid=62277) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2168.686170] env[62277]: DEBUG nova.network.neutron [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Updating instance_info_cache with network_info: [{"id": "1acbadf4-4879-4f3c-b22d-13bd9a1a0e43", "address": "fa:16:3e:3a:37:f7", "network": {"id": "8136627a-e9c1-49b0-83e4-c5a109843cb6", "bridge": "br-int", "label": "tempest-ImagesTestJSON-296368702-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ae9d3e1015a54d54873109ff0650210f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8cbc9b8f-ce19-4262-bf4d-88cd4f259a1c", "external-id": "nsx-vlan-transportzone-630", "segmentation_id": 630, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1acbadf4-48", "ovs_interfaceid": "1acbadf4-4879-4f3c-b22d-13bd9a1a0e43", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2168.696795] env[62277]: DEBUG oslo_concurrency.lockutils [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Releasing lock "refresh_cache-940561d5-723b-4e43-8fab-35e8af95ce09" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2168.697121] env[62277]: DEBUG nova.compute.manager [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Instance network_info: |[{"id": "1acbadf4-4879-4f3c-b22d-13bd9a1a0e43", "address": "fa:16:3e:3a:37:f7", "network": {"id": "8136627a-e9c1-49b0-83e4-c5a109843cb6", "bridge": "br-int", "label": "tempest-ImagesTestJSON-296368702-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ae9d3e1015a54d54873109ff0650210f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8cbc9b8f-ce19-4262-bf4d-88cd4f259a1c", "external-id": "nsx-vlan-transportzone-630", "segmentation_id": 630, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1acbadf4-48", "ovs_interfaceid": "1acbadf4-4879-4f3c-b22d-13bd9a1a0e43", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 2168.697507] env[62277]: DEBUG 
nova.virt.vmwareapi.vmops [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:3a:37:f7', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '8cbc9b8f-ce19-4262-bf4d-88cd4f259a1c', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '1acbadf4-4879-4f3c-b22d-13bd9a1a0e43', 'vif_model': 'vmxnet3'}] {{(pid=62277) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2168.705049] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Creating folder: Project (ae9d3e1015a54d54873109ff0650210f). Parent ref: group-v297781. {{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 2168.705579] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f074a3a3-cd09-4c50-918e-c44c7a59ee06 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2168.715972] env[62277]: INFO nova.virt.vmwareapi.vm_util [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Created folder: Project (ae9d3e1015a54d54873109ff0650210f) in parent group-v297781. [ 2168.716165] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Creating folder: Instances. Parent ref: group-v297888. {{(pid=62277) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 2168.716375] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b5875d7f-9fe2-48f7-9004-9e20c476f92b {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2168.725047] env[62277]: INFO nova.virt.vmwareapi.vm_util [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Created folder: Instances in parent group-v297888. [ 2168.725241] env[62277]: DEBUG oslo.service.loopingcall [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2168.725408] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Creating VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2168.725584] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-c5a80bda-f78f-47b8-9320-e47b239d4434 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2168.743820] env[62277]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2168.743820] env[62277]: value = "task-1405496" [ 2168.743820] env[62277]: _type = "Task" [ 2168.743820] env[62277]: } to complete. 
{{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2168.750683] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405496, 'name': CreateVM_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2169.121588] env[62277]: DEBUG nova.compute.manager [req-fe4b8e04-80ad-4a62-82d3-cbabb07f68dc req-464e4056-79f8-4502-afb4-19508b560005 service nova] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Received event network-vif-plugged-1acbadf4-4879-4f3c-b22d-13bd9a1a0e43 {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 2169.121588] env[62277]: DEBUG oslo_concurrency.lockutils [req-fe4b8e04-80ad-4a62-82d3-cbabb07f68dc req-464e4056-79f8-4502-afb4-19508b560005 service nova] Acquiring lock "940561d5-723b-4e43-8fab-35e8af95ce09-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2169.121588] env[62277]: DEBUG oslo_concurrency.lockutils [req-fe4b8e04-80ad-4a62-82d3-cbabb07f68dc req-464e4056-79f8-4502-afb4-19508b560005 service nova] Lock "940561d5-723b-4e43-8fab-35e8af95ce09-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2169.121588] env[62277]: DEBUG oslo_concurrency.lockutils [req-fe4b8e04-80ad-4a62-82d3-cbabb07f68dc req-464e4056-79f8-4502-afb4-19508b560005 service nova] Lock "940561d5-723b-4e43-8fab-35e8af95ce09-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2169.121791] env[62277]: DEBUG nova.compute.manager [req-fe4b8e04-80ad-4a62-82d3-cbabb07f68dc req-464e4056-79f8-4502-afb4-19508b560005 service nova] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] No waiting events found dispatching network-vif-plugged-1acbadf4-4879-4f3c-b22d-13bd9a1a0e43 {{(pid=62277) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2169.121908] env[62277]: WARNING nova.compute.manager [req-fe4b8e04-80ad-4a62-82d3-cbabb07f68dc req-464e4056-79f8-4502-afb4-19508b560005 service nova] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Received unexpected event network-vif-plugged-1acbadf4-4879-4f3c-b22d-13bd9a1a0e43 for instance with vm_state building and task_state spawning. [ 2169.122078] env[62277]: DEBUG nova.compute.manager [req-fe4b8e04-80ad-4a62-82d3-cbabb07f68dc req-464e4056-79f8-4502-afb4-19508b560005 service nova] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Received event network-changed-1acbadf4-4879-4f3c-b22d-13bd9a1a0e43 {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 2169.122233] env[62277]: DEBUG nova.compute.manager [req-fe4b8e04-80ad-4a62-82d3-cbabb07f68dc req-464e4056-79f8-4502-afb4-19508b560005 service nova] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Refreshing instance network info cache due to event network-changed-1acbadf4-4879-4f3c-b22d-13bd9a1a0e43. 
{{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11104}} [ 2169.122415] env[62277]: DEBUG oslo_concurrency.lockutils [req-fe4b8e04-80ad-4a62-82d3-cbabb07f68dc req-464e4056-79f8-4502-afb4-19508b560005 service nova] Acquiring lock "refresh_cache-940561d5-723b-4e43-8fab-35e8af95ce09" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2169.122549] env[62277]: DEBUG oslo_concurrency.lockutils [req-fe4b8e04-80ad-4a62-82d3-cbabb07f68dc req-464e4056-79f8-4502-afb4-19508b560005 service nova] Acquired lock "refresh_cache-940561d5-723b-4e43-8fab-35e8af95ce09" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2169.122725] env[62277]: DEBUG nova.network.neutron [req-fe4b8e04-80ad-4a62-82d3-cbabb07f68dc req-464e4056-79f8-4502-afb4-19508b560005 service nova] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Refreshing network info cache for port 1acbadf4-4879-4f3c-b22d-13bd9a1a0e43 {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 2169.253451] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405496, 'name': CreateVM_Task, 'duration_secs': 0.271165} completed successfully. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2169.253642] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Created VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2169.254287] env[62277]: DEBUG oslo_concurrency.lockutils [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2169.254452] env[62277]: DEBUG oslo_concurrency.lockutils [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2169.254775] env[62277]: DEBUG oslo_concurrency.lockutils [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2169.255030] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-5e5eef88-b234-4ab3-82a8-0d5da6103efa {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2169.262673] env[62277]: DEBUG oslo_vmware.api [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Waiting for the task: (returnval){ [ 2169.262673] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52d4ec22-17ef-4229-3a6c-73e45e19ae77" [ 2169.262673] env[62277]: _type = "Task" [ 2169.262673] env[62277]: } to complete. 
{{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2169.275692] env[62277]: DEBUG oslo_vmware.api [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52d4ec22-17ef-4229-3a6c-73e45e19ae77, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2169.372844] env[62277]: DEBUG nova.network.neutron [req-fe4b8e04-80ad-4a62-82d3-cbabb07f68dc req-464e4056-79f8-4502-afb4-19508b560005 service nova] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Updated VIF entry in instance network info cache for port 1acbadf4-4879-4f3c-b22d-13bd9a1a0e43. {{(pid=62277) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 2169.373231] env[62277]: DEBUG nova.network.neutron [req-fe4b8e04-80ad-4a62-82d3-cbabb07f68dc req-464e4056-79f8-4502-afb4-19508b560005 service nova] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Updating instance_info_cache with network_info: [{"id": "1acbadf4-4879-4f3c-b22d-13bd9a1a0e43", "address": "fa:16:3e:3a:37:f7", "network": {"id": "8136627a-e9c1-49b0-83e4-c5a109843cb6", "bridge": "br-int", "label": "tempest-ImagesTestJSON-296368702-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ae9d3e1015a54d54873109ff0650210f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8cbc9b8f-ce19-4262-bf4d-88cd4f259a1c", "external-id": "nsx-vlan-transportzone-630", "segmentation_id": 630, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1acbadf4-48", "ovs_interfaceid": "1acbadf4-4879-4f3c-b22d-13bd9a1a0e43", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2169.384624] env[62277]: DEBUG oslo_concurrency.lockutils [req-fe4b8e04-80ad-4a62-82d3-cbabb07f68dc req-464e4056-79f8-4502-afb4-19508b560005 service nova] Releasing lock "refresh_cache-940561d5-723b-4e43-8fab-35e8af95ce09" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2169.772717] env[62277]: DEBUG oslo_concurrency.lockutils [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2169.772979] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Processing image 6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2169.773231] env[62277]: DEBUG 
oslo_concurrency.lockutils [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2176.998177] env[62277]: DEBUG oslo_concurrency.lockutils [None req-a0f67795-2bbf-4c83-a3ce-b2d8fcd83060 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Acquiring lock "297d53df-7918-4389-9c63-a600755da969" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2191.176085] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2192.168832] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2194.164305] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2194.167960] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2194.168131] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Starting heal instance info cache {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9909}} [ 2194.168252] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Rebuilding the list of instances to heal {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9913}} [ 2194.189377] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2194.189581] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2194.189659] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Skipping network cache update for instance because it is Building. 
{{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2194.189771] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2194.189893] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2194.190120] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2194.190292] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2194.190420] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2194.190539] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 297d53df-7918-4389-9c63-a600755da969] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2194.190655] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2194.190775] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Didn't find any instances for network info cache update. 
{{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9995}} [ 2194.191235] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2196.169477] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2196.169795] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager.update_available_resource {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2196.181940] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2196.182232] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2196.182561] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2196.182561] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62277) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2196.183683] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9306c384-c230-4573-b41d-517840dc4a89 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2196.192426] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f1257735-df4a-4bce-917c-01f624a6e06b {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2196.206543] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c998bff2-9664-4893-960b-7a30d36a1d02 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2196.212809] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44f21330-16b1-4086-8407-74530c3cfc60 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2196.241754] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None 
None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181415MB free_disk=184GB free_vcpus=48 pci_devices=None {{(pid=62277) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2196.241754] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2196.241938] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2196.316428] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2196.316592] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 163eb4e7-33f8-4674-8a3f-5094356e250d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2196.316914] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 400beb27-a709-4ef4-851e-5caaab9ca60b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2196.316914] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 9f7d4431-d5ea-4f9b-888b-77a6a7772047 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2196.317065] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 8b9ef530-e79f-4cd4-8a88-83871ed65f90 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2196.317065] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance c4c22c8a-4300-45ce-8484-77c638f7bbc5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2196.317178] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance ad83bf06-d712-4bd4-8086-9c3b615adaf5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2196.317295] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance de543a46-26c3-40b3-9ccd-80bb1f9845d7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2196.317408] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 297d53df-7918-4389-9c63-a600755da969 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2196.317520] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 940561d5-723b-4e43-8fab-35e8af95ce09 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2196.328397] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 6737c3b9-d9e6-4879-a6df-46d3c7dee40e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 2196.339187] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 40a309bf-6b7f-4360-a083-640db68bb00b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 2196.339411] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2196.339556] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2196.473786] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ea3ca56e-5ce4-4d5a-a2ae-27761c362d84 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2196.481343] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0cb07b45-2a5b-4991-9774-574f50a9e570 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2196.511730] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e1563fbe-0bba-4d5b-ae90-fd8e33376030 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2196.518757] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-59201d2a-846e-4a4d-835c-fa0e2f77578d {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2196.531344] env[62277]: DEBUG nova.compute.provider_tree [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2196.539814] env[62277]: DEBUG nova.scheduler.client.report [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2196.552795] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62277) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2196.552997] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.311s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2197.552420] env[62277]: DEBUG oslo_service.periodic_task [None 
req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2198.169252] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2198.169452] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=62277) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10528}} [ 2199.164653] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2200.819671] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Acquiring lock "4f26ed27-558d-489a-9141-ec63b6164cc8" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2200.819992] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Lock "4f26ed27-558d-489a-9141-ec63b6164cc8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2209.741061] env[62277]: DEBUG oslo_concurrency.lockutils [None req-6a798d6b-fafe-4058-9ff9-1999ee5fd9e1 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Acquiring lock "940561d5-723b-4e43-8fab-35e8af95ce09" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2215.010545] env[62277]: WARNING oslo_vmware.rw_handles [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2215.010545] env[62277]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2215.010545] env[62277]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2215.010545] env[62277]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2215.010545] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2215.010545] env[62277]: ERROR oslo_vmware.rw_handles response.begin() [ 2215.010545] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2215.010545] env[62277]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2215.010545] env[62277]: ERROR 
oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2215.010545] env[62277]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2215.010545] env[62277]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2215.010545] env[62277]: ERROR oslo_vmware.rw_handles [ 2215.011351] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Downloaded image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to vmware_temp/eab6ee2d-f72a-449e-a310-b5f8309e9112/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2215.013020] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Caching image {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2215.013303] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Copying Virtual Disk [datastore2] vmware_temp/eab6ee2d-f72a-449e-a310-b5f8309e9112/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk to [datastore2] vmware_temp/eab6ee2d-f72a-449e-a310-b5f8309e9112/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk {{(pid=62277) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2215.013583] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-5dee25a2-4eb2-46c1-a56f-47442a2a6cf9 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2215.022385] env[62277]: DEBUG oslo_vmware.api [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Waiting for the task: (returnval){ [ 2215.022385] env[62277]: value = "task-1405497" [ 2215.022385] env[62277]: _type = "Task" [ 2215.022385] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2215.030280] env[62277]: DEBUG oslo_vmware.api [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Task: {'id': task-1405497, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2215.532858] env[62277]: DEBUG oslo_vmware.exceptions [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Fault InvalidArgument not matched. 
{{(pid=62277) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2215.533146] env[62277]: DEBUG oslo_concurrency.lockutils [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2215.533710] env[62277]: ERROR nova.compute.manager [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2215.533710] env[62277]: Faults: ['InvalidArgument'] [ 2215.533710] env[62277]: ERROR nova.compute.manager [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Traceback (most recent call last): [ 2215.533710] env[62277]: ERROR nova.compute.manager [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2215.533710] env[62277]: ERROR nova.compute.manager [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] yield resources [ 2215.533710] env[62277]: ERROR nova.compute.manager [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2215.533710] env[62277]: ERROR nova.compute.manager [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] self.driver.spawn(context, instance, image_meta, [ 2215.533710] env[62277]: ERROR nova.compute.manager [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2215.533710] env[62277]: ERROR nova.compute.manager [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2215.533710] env[62277]: ERROR nova.compute.manager [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2215.533710] env[62277]: ERROR nova.compute.manager [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] self._fetch_image_if_missing(context, vi) [ 2215.533710] env[62277]: ERROR nova.compute.manager [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2215.533710] env[62277]: ERROR nova.compute.manager [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] image_cache(vi, tmp_image_ds_loc) [ 2215.533710] env[62277]: ERROR nova.compute.manager [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2215.533710] env[62277]: ERROR nova.compute.manager [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] vm_util.copy_virtual_disk( [ 2215.533710] env[62277]: ERROR nova.compute.manager [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2215.533710] env[62277]: ERROR nova.compute.manager [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] session._wait_for_task(vmdk_copy_task) [ 2215.533710] env[62277]: ERROR nova.compute.manager [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2215.533710] env[62277]: ERROR nova.compute.manager [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] return self.wait_for_task(task_ref) [ 2215.533710] env[62277]: ERROR nova.compute.manager [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2215.533710] env[62277]: ERROR nova.compute.manager [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] return evt.wait() [ 2215.533710] env[62277]: ERROR nova.compute.manager [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2215.533710] env[62277]: ERROR nova.compute.manager [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] result = hub.switch() [ 2215.533710] env[62277]: ERROR nova.compute.manager [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2215.533710] env[62277]: ERROR nova.compute.manager [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] return self.greenlet.switch() [ 2215.533710] env[62277]: ERROR nova.compute.manager [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2215.533710] env[62277]: ERROR nova.compute.manager [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] self.f(*self.args, **self.kw) [ 2215.533710] env[62277]: ERROR nova.compute.manager [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2215.533710] env[62277]: ERROR nova.compute.manager [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] raise exceptions.translate_fault(task_info.error) [ 2215.533710] env[62277]: ERROR nova.compute.manager [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2215.533710] env[62277]: ERROR nova.compute.manager [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Faults: ['InvalidArgument'] [ 2215.533710] env[62277]: ERROR nova.compute.manager [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] [ 2215.534880] env[62277]: INFO nova.compute.manager [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Terminating instance [ 2215.535623] env[62277]: DEBUG oslo_concurrency.lockutils [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2215.535748] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2215.535950] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with 
opID=oslo.vmware-1198cf95-a281-4639-95ac-5b4471923ebf {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2215.537976] env[62277]: DEBUG nova.compute.manager [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Start destroying the instance on the hypervisor. {{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2215.538178] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2215.538871] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-36059b73-d322-4adf-bcc3-562d30392403 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2215.545480] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Unregistering the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2215.545711] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-b707cda7-cb52-49af-a8a5-6af640947c95 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2215.547661] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2215.547829] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62277) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2215.548729] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e67b2855-5844-49d3-96e5-4f45a2e08e3f {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2215.553125] env[62277]: DEBUG oslo_vmware.api [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Waiting for the task: (returnval){ [ 2215.553125] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]5247d3ef-da92-d3ae-90cb-5b432bd73189" [ 2215.553125] env[62277]: _type = "Task" [ 2215.553125] env[62277]: } to complete. 
{{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2215.560258] env[62277]: DEBUG oslo_vmware.api [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]5247d3ef-da92-d3ae-90cb-5b432bd73189, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2215.617502] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Unregistered the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2215.617728] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Deleting contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2215.617886] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Deleting the datastore file [datastore2] 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3 {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2215.618161] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-6c057e7f-eda6-4025-8b25-999d5d02bb52 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2215.624920] env[62277]: DEBUG oslo_vmware.api [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Waiting for the task: (returnval){ [ 2215.624920] env[62277]: value = "task-1405499" [ 2215.624920] env[62277]: _type = "Task" [ 2215.624920] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2215.633324] env[62277]: DEBUG oslo_vmware.api [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Task: {'id': task-1405499, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2216.064054] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Preparing fetch location {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2216.064054] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Creating directory with path [datastore2] vmware_temp/77686c09-ee64-47a9-a6f2-1cf1c3716879/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2216.064054] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9d4c3ec0-19fa-4d68-98b2-0c0d82a1f696 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2216.074689] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Created directory with path [datastore2] vmware_temp/77686c09-ee64-47a9-a6f2-1cf1c3716879/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2216.074877] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Fetch image to [datastore2] vmware_temp/77686c09-ee64-47a9-a6f2-1cf1c3716879/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2216.075050] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to [datastore2] vmware_temp/77686c09-ee64-47a9-a6f2-1cf1c3716879/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2216.075751] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-757ef7df-f323-47ac-b018-73432848bb3f {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2216.081821] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a38eba4-6932-444f-991c-743fa6766282 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2216.090503] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f454c8f4-f9df-4cd0-b450-f5e6ced1f2db {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2216.121423] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-74ac4f7a-200d-4ab4-aaad-773a47692397 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2216.129353] env[62277]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-ad6edb43-d9cd-400b-b96e-792e1449d634 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2216.135420] env[62277]: DEBUG oslo_vmware.api [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Task: {'id': task-1405499, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.073693} completed successfully. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2216.135635] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Deleted the datastore file {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2216.135812] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Deleted contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2216.135978] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2216.136169] env[62277]: INFO nova.compute.manager [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Took 0.60 seconds to destroy the instance on the hypervisor. 
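Aside: the teardown above (VirtualMachine.UnregisterVM, then a DeleteDatastoreFile_Task that is polled through wait_for_task until it reports "completed successfully") follows the same poll-until-terminal-state pattern that the failed CopyVirtualDisk_Task traceback earlier went through. Below is a minimal, self-contained sketch of that loop; the FakeTaskInfo class and the get_task_info callable are stand-ins for a live vCenter session, and this is illustrative only, not the oslo.vmware implementation.

```python
import time


class FakeTaskInfo:
    """Stand-in for the task_info object a real vSphere session would return."""

    def __init__(self, state, error=None, progress=0):
        self.state = state        # 'running', 'success' or 'error'
        self.error = error        # fault description on failure
        self.progress = progress  # percent complete, as logged by _poll_task


def wait_for_task(get_task_info, poll_interval=0.5):
    """Poll a task until it reaches a terminal state; raise on 'error'."""
    while True:
        info = get_task_info()
        if info.state == 'success':
            return info
        if info.state == 'error':
            # oslo.vmware translates the fault (e.g. InvalidArgument) into a
            # VimFaultException; a plain exception is used here for brevity.
            raise RuntimeError("task failed: %s" % info.error)
        time.sleep(poll_interval)


# Example: a task that reports progress twice and then succeeds.
_states = iter([
    FakeTaskInfo('running', progress=0),
    FakeTaskInfo('running', progress=50),
    FakeTaskInfo('success', progress=100),
])
print(wait_for_task(lambda: next(_states), poll_interval=0).state)  # success
```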
[ 2216.138400] env[62277]: DEBUG nova.compute.claims [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Aborting claim: {{(pid=62277) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 2216.138637] env[62277]: DEBUG oslo_concurrency.lockutils [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2216.138927] env[62277]: DEBUG oslo_concurrency.lockutils [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2216.150355] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2216.202544] env[62277]: DEBUG oslo_vmware.rw_handles [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/77686c09-ee64-47a9-a6f2-1cf1c3716879/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62277) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2216.263549] env[62277]: DEBUG oslo_vmware.rw_handles [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Completed reading data from the image iterator. {{(pid=62277) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2216.263740] env[62277]: DEBUG oslo_vmware.rw_handles [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/77686c09-ee64-47a9-a6f2-1cf1c3716879/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=62277) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2216.392062] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-df084e39-23e9-4510-838d-2b45899e1f41 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2216.399719] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-139922f7-987f-4898-8ea9-ff57b69a47a0 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2216.428622] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-88af7131-f690-4330-8524-ae6beddbfa96 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2216.435951] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bb2ef48d-64f4-4271-96d3-e1d91bd7f901 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2216.448793] env[62277]: DEBUG nova.compute.provider_tree [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2216.457222] env[62277]: DEBUG nova.scheduler.client.report [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2216.471321] env[62277]: DEBUG oslo_concurrency.lockutils [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.332s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2216.471861] env[62277]: ERROR nova.compute.manager [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2216.471861] env[62277]: Faults: ['InvalidArgument'] [ 2216.471861] env[62277]: ERROR nova.compute.manager [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Traceback (most recent call last): [ 2216.471861] env[62277]: ERROR nova.compute.manager [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in 
_build_and_run_instance [ 2216.471861] env[62277]: ERROR nova.compute.manager [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] self.driver.spawn(context, instance, image_meta, [ 2216.471861] env[62277]: ERROR nova.compute.manager [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2216.471861] env[62277]: ERROR nova.compute.manager [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2216.471861] env[62277]: ERROR nova.compute.manager [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2216.471861] env[62277]: ERROR nova.compute.manager [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] self._fetch_image_if_missing(context, vi) [ 2216.471861] env[62277]: ERROR nova.compute.manager [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2216.471861] env[62277]: ERROR nova.compute.manager [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] image_cache(vi, tmp_image_ds_loc) [ 2216.471861] env[62277]: ERROR nova.compute.manager [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2216.471861] env[62277]: ERROR nova.compute.manager [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] vm_util.copy_virtual_disk( [ 2216.471861] env[62277]: ERROR nova.compute.manager [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2216.471861] env[62277]: ERROR nova.compute.manager [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] session._wait_for_task(vmdk_copy_task) [ 2216.471861] env[62277]: ERROR nova.compute.manager [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2216.471861] env[62277]: ERROR nova.compute.manager [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] return self.wait_for_task(task_ref) [ 2216.471861] env[62277]: ERROR nova.compute.manager [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2216.471861] env[62277]: ERROR nova.compute.manager [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] return evt.wait() [ 2216.471861] env[62277]: ERROR nova.compute.manager [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2216.471861] env[62277]: ERROR nova.compute.manager [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] result = hub.switch() [ 2216.471861] env[62277]: ERROR nova.compute.manager [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2216.471861] env[62277]: ERROR nova.compute.manager [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] return self.greenlet.switch() [ 2216.471861] env[62277]: ERROR nova.compute.manager [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2216.471861] env[62277]: ERROR nova.compute.manager [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] self.f(*self.args, **self.kw) [ 2216.471861] env[62277]: ERROR nova.compute.manager [instance: 
35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2216.471861] env[62277]: ERROR nova.compute.manager [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] raise exceptions.translate_fault(task_info.error) [ 2216.471861] env[62277]: ERROR nova.compute.manager [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2216.471861] env[62277]: ERROR nova.compute.manager [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Faults: ['InvalidArgument'] [ 2216.471861] env[62277]: ERROR nova.compute.manager [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] [ 2216.472863] env[62277]: DEBUG nova.compute.utils [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] VimFaultException {{(pid=62277) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2216.473892] env[62277]: DEBUG nova.compute.manager [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Build of instance 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3 was re-scheduled: A specified parameter was not correct: fileType [ 2216.473892] env[62277]: Faults: ['InvalidArgument'] {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2216.474269] env[62277]: DEBUG nova.compute.manager [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Unplugging VIFs for instance {{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2216.474437] env[62277]: DEBUG nova.compute.manager [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2216.474620] env[62277]: DEBUG nova.compute.manager [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2216.474789] env[62277]: DEBUG nova.network.neutron [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2217.003244] env[62277]: DEBUG nova.network.neutron [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2217.017574] env[62277]: INFO nova.compute.manager [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Took 0.54 seconds to deallocate network for instance. [ 2217.113714] env[62277]: INFO nova.scheduler.client.report [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Deleted allocations for instance 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3 [ 2217.135844] env[62277]: DEBUG oslo_concurrency.lockutils [None req-98b94e25-4408-497a-b296-d6830f20b77a tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Lock "35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 684.489s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2217.136950] env[62277]: DEBUG oslo_concurrency.lockutils [None req-048456fb-7e13-412f-a5bf-3a6c3c06d6e3 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Lock "35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 487.768s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2217.137182] env[62277]: DEBUG oslo_concurrency.lockutils [None req-048456fb-7e13-412f-a5bf-3a6c3c06d6e3 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Acquiring lock "35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2217.137385] env[62277]: DEBUG oslo_concurrency.lockutils [None req-048456fb-7e13-412f-a5bf-3a6c3c06d6e3 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Lock "35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3-events" 
acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2217.137545] env[62277]: DEBUG oslo_concurrency.lockutils [None req-048456fb-7e13-412f-a5bf-3a6c3c06d6e3 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Lock "35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2217.139423] env[62277]: INFO nova.compute.manager [None req-048456fb-7e13-412f-a5bf-3a6c3c06d6e3 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Terminating instance [ 2217.141077] env[62277]: DEBUG nova.compute.manager [None req-048456fb-7e13-412f-a5bf-3a6c3c06d6e3 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Start destroying the instance on the hypervisor. {{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2217.141266] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-048456fb-7e13-412f-a5bf-3a6c3c06d6e3 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2217.141717] env[62277]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-ead5a257-3f29-4d17-a0be-d3c032aa1e82 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2217.147662] env[62277]: DEBUG nova.compute.manager [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 2217.154293] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cd4dd5cc-4050-4b92-a4ae-ec1895e145c0 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2217.183271] env[62277]: WARNING nova.virt.vmwareapi.vmops [None req-048456fb-7e13-412f-a5bf-3a6c3c06d6e3 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3 could not be found. 
[ 2217.183549] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-048456fb-7e13-412f-a5bf-3a6c3c06d6e3 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2217.183795] env[62277]: INFO nova.compute.manager [None req-048456fb-7e13-412f-a5bf-3a6c3c06d6e3 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2217.184114] env[62277]: DEBUG oslo.service.loopingcall [None req-048456fb-7e13-412f-a5bf-3a6c3c06d6e3 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2217.186442] env[62277]: DEBUG nova.compute.manager [-] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2217.186593] env[62277]: DEBUG nova.network.neutron [-] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2217.199810] env[62277]: DEBUG oslo_concurrency.lockutils [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2217.200056] env[62277]: DEBUG oslo_concurrency.lockutils [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2217.201820] env[62277]: INFO nova.compute.claims [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2217.212863] env[62277]: DEBUG nova.network.neutron [-] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2217.227858] env[62277]: INFO nova.compute.manager [-] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] Took 0.04 seconds to deallocate network for instance. 
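Aside: the claim logged at 2217.201820 succeeds against the same provider inventory that the report client prints for provider 75e125ea-a599-4b65-b9cd-6ea881735292. Assuming the usual placement convention that consumable capacity per resource class is (total - reserved) * allocation_ratio, the logged figures work out as in the short check below (a worked example, not Nova code).

```python
# Inventory values copied from the report-client log lines for provider
# 75e125ea-a599-4b65-b9cd-6ea881735292; only the fields used here are kept.
inventory = {
    'VCPU': {'total': 48, 'reserved': 0, 'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
    'DISK_GB': {'total': 400, 'reserved': 0, 'allocation_ratio': 1.0},
}

for rc, inv in inventory.items():
    capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
    print(f"{rc}: {capacity:g} units available to claims")
# VCPU: 192, MEMORY_MB: 196078, DISK_GB: 400
```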
[ 2217.312012] env[62277]: DEBUG oslo_concurrency.lockutils [None req-048456fb-7e13-412f-a5bf-3a6c3c06d6e3 tempest-ServerRescueNegativeTestJSON-216449667 tempest-ServerRescueNegativeTestJSON-216449667-project-member] Lock "35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.175s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2217.313061] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 347.186s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2217.313268] env[62277]: INFO nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3] During sync_power_state the instance has a pending task (deleting). Skip. [ 2217.313452] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "35a9bfef-3fbf-4b66-aa0a-0651d6a1daf3" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2217.383724] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef9a1ba7-8a76-4566-bc6a-c759a5edec6d {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2217.391179] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-185b85f2-b76a-4437-8c67-cfa4bde4e54d {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2217.421463] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9bcc41f7-ada4-4456-adca-6d93d7472cee {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2217.428982] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-408d86fc-11c0-4325-939f-e283ed84a1c5 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2217.442063] env[62277]: DEBUG nova.compute.provider_tree [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2217.451857] env[62277]: DEBUG nova.scheduler.client.report [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 
0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2217.466321] env[62277]: DEBUG oslo_concurrency.lockutils [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.266s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2217.466793] env[62277]: DEBUG nova.compute.manager [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Start building networks asynchronously for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 2217.499694] env[62277]: DEBUG nova.compute.utils [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Using /dev/sd instead of None {{(pid=62277) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2217.501246] env[62277]: DEBUG nova.compute.manager [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Allocating IP information in the background. {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 2217.501419] env[62277]: DEBUG nova.network.neutron [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] allocate_for_instance() {{(pid=62277) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2217.509468] env[62277]: DEBUG nova.compute.manager [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Start building block device mappings for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 2217.572499] env[62277]: DEBUG nova.compute.manager [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Start spawning the instance on the hypervisor. 
{{(pid=62277) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 2217.588393] env[62277]: DEBUG nova.policy [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '00ed93b61873452bbc15280d2de65bd8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0c951cee39d94e49af963590cccf95fb', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62277) authorize /opt/stack/nova/nova/policy.py:203}} [ 2217.608598] env[62277]: DEBUG nova.virt.hardware [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-11-06T22:11:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-11-06T22:11:15Z,direct_url=,disk_format='vmdk',id=6f125163-af69-40e9-92ae-3b8a01d74b60,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='63ed6630e9c140baa826f53d7a0564d1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-11-06T22:11:16Z,virtual_size=,visibility=), allow threads: False {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2217.608904] env[62277]: DEBUG nova.virt.hardware [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Flavor limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2217.609311] env[62277]: DEBUG nova.virt.hardware [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Image limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2217.609435] env[62277]: DEBUG nova.virt.hardware [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Flavor pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2217.609615] env[62277]: DEBUG nova.virt.hardware [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Image pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2217.609809] env[62277]: DEBUG nova.virt.hardware [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2217.610257] env[62277]: DEBUG nova.virt.hardware [None 
req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2217.610455] env[62277]: DEBUG nova.virt.hardware [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2217.610726] env[62277]: DEBUG nova.virt.hardware [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Got 1 possible topologies {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2217.610920] env[62277]: DEBUG nova.virt.hardware [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2217.611124] env[62277]: DEBUG nova.virt.hardware [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2217.612296] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62992cd8-9d57-4300-afd0-8e9f83ec842b {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2217.620872] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a1822d6d-dab1-4d71-bf3e-8a0172b3ecb1 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2217.891189] env[62277]: DEBUG nova.network.neutron [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Successfully created port: 09ca48ca-5ded-497d-8bd1-724743181aa0 {{(pid=62277) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2218.675180] env[62277]: DEBUG nova.network.neutron [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Successfully updated port: 09ca48ca-5ded-497d-8bd1-724743181aa0 {{(pid=62277) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2218.691863] env[62277]: DEBUG oslo_concurrency.lockutils [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Acquiring lock "refresh_cache-6737c3b9-d9e6-4879-a6df-46d3c7dee40e" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2218.692128] env[62277]: DEBUG oslo_concurrency.lockutils [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 
tempest-DeleteServersTestJSON-689790343-project-member] Acquired lock "refresh_cache-6737c3b9-d9e6-4879-a6df-46d3c7dee40e" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2218.692220] env[62277]: DEBUG nova.network.neutron [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Building network info cache for instance {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2218.750016] env[62277]: DEBUG nova.network.neutron [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Instance cache missing network info. {{(pid=62277) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2218.927957] env[62277]: DEBUG nova.network.neutron [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Updating instance_info_cache with network_info: [{"id": "09ca48ca-5ded-497d-8bd1-724743181aa0", "address": "fa:16:3e:b0:ad:5b", "network": {"id": "73ff19f8-2a7e-46b5-ad84-44eb5672dd85", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1025699415-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0c951cee39d94e49af963590cccf95fb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "09bf081b-cdf0-4977-abe2-2339a87409ab", "external-id": "nsx-vlan-transportzone-378", "segmentation_id": 378, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap09ca48ca-5d", "ovs_interfaceid": "09ca48ca-5ded-497d-8bd1-724743181aa0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2218.941813] env[62277]: DEBUG oslo_concurrency.lockutils [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Releasing lock "refresh_cache-6737c3b9-d9e6-4879-a6df-46d3c7dee40e" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2218.942158] env[62277]: DEBUG nova.compute.manager [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Instance network_info: |[{"id": "09ca48ca-5ded-497d-8bd1-724743181aa0", "address": "fa:16:3e:b0:ad:5b", "network": {"id": "73ff19f8-2a7e-46b5-ad84-44eb5672dd85", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1025699415-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0c951cee39d94e49af963590cccf95fb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "09bf081b-cdf0-4977-abe2-2339a87409ab", "external-id": "nsx-vlan-transportzone-378", "segmentation_id": 378, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap09ca48ca-5d", "ovs_interfaceid": "09ca48ca-5ded-497d-8bd1-724743181aa0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 2218.943017] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:b0:ad:5b', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '09bf081b-cdf0-4977-abe2-2339a87409ab', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '09ca48ca-5ded-497d-8bd1-724743181aa0', 'vif_model': 'vmxnet3'}] {{(pid=62277) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2218.950817] env[62277]: DEBUG oslo.service.loopingcall [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2218.951340] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Creating VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2218.951576] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-1b4dc86a-2b0d-406c-98ad-3738b3d30487 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2218.973826] env[62277]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2218.973826] env[62277]: value = "task-1405500" [ 2218.973826] env[62277]: _type = "Task" [ 2218.973826] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2218.981957] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405500, 'name': CreateVM_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2219.047398] env[62277]: DEBUG nova.compute.manager [req-ceaa8bfe-7a99-4438-b0ba-ee604458d709 req-9a6937c1-351c-4d87-92ab-1e44b45b854c service nova] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Received event network-vif-plugged-09ca48ca-5ded-497d-8bd1-724743181aa0 {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 2219.047398] env[62277]: DEBUG oslo_concurrency.lockutils [req-ceaa8bfe-7a99-4438-b0ba-ee604458d709 req-9a6937c1-351c-4d87-92ab-1e44b45b854c service nova] Acquiring lock "6737c3b9-d9e6-4879-a6df-46d3c7dee40e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2219.047398] env[62277]: DEBUG oslo_concurrency.lockutils [req-ceaa8bfe-7a99-4438-b0ba-ee604458d709 req-9a6937c1-351c-4d87-92ab-1e44b45b854c service nova] Lock "6737c3b9-d9e6-4879-a6df-46d3c7dee40e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2219.047398] env[62277]: DEBUG oslo_concurrency.lockutils [req-ceaa8bfe-7a99-4438-b0ba-ee604458d709 req-9a6937c1-351c-4d87-92ab-1e44b45b854c service nova] Lock "6737c3b9-d9e6-4879-a6df-46d3c7dee40e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2219.047398] env[62277]: DEBUG nova.compute.manager [req-ceaa8bfe-7a99-4438-b0ba-ee604458d709 req-9a6937c1-351c-4d87-92ab-1e44b45b854c service nova] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] No waiting events found dispatching network-vif-plugged-09ca48ca-5ded-497d-8bd1-724743181aa0 {{(pid=62277) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2219.047398] env[62277]: WARNING nova.compute.manager [req-ceaa8bfe-7a99-4438-b0ba-ee604458d709 req-9a6937c1-351c-4d87-92ab-1e44b45b854c service nova] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Received unexpected event network-vif-plugged-09ca48ca-5ded-497d-8bd1-724743181aa0 for instance with vm_state building and task_state spawning. [ 2219.047945] env[62277]: DEBUG nova.compute.manager [req-ceaa8bfe-7a99-4438-b0ba-ee604458d709 req-9a6937c1-351c-4d87-92ab-1e44b45b854c service nova] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Received event network-changed-09ca48ca-5ded-497d-8bd1-724743181aa0 {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 2219.047945] env[62277]: DEBUG nova.compute.manager [req-ceaa8bfe-7a99-4438-b0ba-ee604458d709 req-9a6937c1-351c-4d87-92ab-1e44b45b854c service nova] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Refreshing instance network info cache due to event network-changed-09ca48ca-5ded-497d-8bd1-724743181aa0. 
{{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11104}} [ 2219.047945] env[62277]: DEBUG oslo_concurrency.lockutils [req-ceaa8bfe-7a99-4438-b0ba-ee604458d709 req-9a6937c1-351c-4d87-92ab-1e44b45b854c service nova] Acquiring lock "refresh_cache-6737c3b9-d9e6-4879-a6df-46d3c7dee40e" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2219.048066] env[62277]: DEBUG oslo_concurrency.lockutils [req-ceaa8bfe-7a99-4438-b0ba-ee604458d709 req-9a6937c1-351c-4d87-92ab-1e44b45b854c service nova] Acquired lock "refresh_cache-6737c3b9-d9e6-4879-a6df-46d3c7dee40e" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2219.048131] env[62277]: DEBUG nova.network.neutron [req-ceaa8bfe-7a99-4438-b0ba-ee604458d709 req-9a6937c1-351c-4d87-92ab-1e44b45b854c service nova] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Refreshing network info cache for port 09ca48ca-5ded-497d-8bd1-724743181aa0 {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 2219.386123] env[62277]: DEBUG nova.network.neutron [req-ceaa8bfe-7a99-4438-b0ba-ee604458d709 req-9a6937c1-351c-4d87-92ab-1e44b45b854c service nova] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Updated VIF entry in instance network info cache for port 09ca48ca-5ded-497d-8bd1-724743181aa0. {{(pid=62277) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 2219.386654] env[62277]: DEBUG nova.network.neutron [req-ceaa8bfe-7a99-4438-b0ba-ee604458d709 req-9a6937c1-351c-4d87-92ab-1e44b45b854c service nova] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Updating instance_info_cache with network_info: [{"id": "09ca48ca-5ded-497d-8bd1-724743181aa0", "address": "fa:16:3e:b0:ad:5b", "network": {"id": "73ff19f8-2a7e-46b5-ad84-44eb5672dd85", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1025699415-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0c951cee39d94e49af963590cccf95fb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "09bf081b-cdf0-4977-abe2-2339a87409ab", "external-id": "nsx-vlan-transportzone-378", "segmentation_id": 378, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap09ca48ca-5d", "ovs_interfaceid": "09ca48ca-5ded-497d-8bd1-724743181aa0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2219.397706] env[62277]: DEBUG oslo_concurrency.lockutils [req-ceaa8bfe-7a99-4438-b0ba-ee604458d709 req-9a6937c1-351c-4d87-92ab-1e44b45b854c service nova] Releasing lock "refresh_cache-6737c3b9-d9e6-4879-a6df-46d3c7dee40e" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2219.484374] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405500, 'name': CreateVM_Task, 'duration_secs': 0.292307} completed successfully. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2219.484584] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Created VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2219.485192] env[62277]: DEBUG oslo_concurrency.lockutils [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2219.485353] env[62277]: DEBUG oslo_concurrency.lockutils [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2219.485681] env[62277]: DEBUG oslo_concurrency.lockutils [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2219.485934] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4dffaf90-9ce9-4a84-a847-ba2491126cc7 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2219.490533] env[62277]: DEBUG oslo_vmware.api [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Waiting for the task: (returnval){ [ 2219.490533] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52d136d0-5af6-7146-e3af-d73d995d064d" [ 2219.490533] env[62277]: _type = "Task" [ 2219.490533] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2219.497995] env[62277]: DEBUG oslo_vmware.api [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52d136d0-5af6-7146-e3af-d73d995d064d, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2220.001200] env[62277]: DEBUG oslo_concurrency.lockutils [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2220.001655] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Processing image 6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2220.001655] env[62277]: DEBUG oslo_concurrency.lockutils [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2234.796610] env[62277]: DEBUG oslo_concurrency.lockutils [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Acquiring lock "26a7549d-94b4-4113-ab8b-10886eafcd49" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2234.796982] env[62277]: DEBUG oslo_concurrency.lockutils [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Lock "26a7549d-94b4-4113-ab8b-10886eafcd49" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2241.674408] env[62277]: DEBUG oslo_concurrency.lockutils [None req-3c48291b-c414-44de-89fc-af67fcfe25e8 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Acquiring lock "6737c3b9-d9e6-4879-a6df-46d3c7dee40e" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2251.169580] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2253.169038] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2254.163624] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62277) 
run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2255.168662] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2256.169408] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2256.169718] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Starting heal instance info cache {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9909}} [ 2256.169718] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Rebuilding the list of instances to heal {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9913}} [ 2256.191641] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2256.191810] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2256.191942] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2256.192102] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2256.192229] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2256.192349] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2256.192468] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Skipping network cache update for instance because it is Building. 
{{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2256.192585] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 297d53df-7918-4389-9c63-a600755da969] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2256.192710] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2256.192858] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2256.192981] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Didn't find any instances for network info cache update. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9995}} [ 2256.193489] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager.update_available_resource {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2256.203858] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2256.204069] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2256.204234] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2256.204378] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62277) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2256.205457] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a86ca283-3c7a-4dbd-8962-d316259f01bf {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2256.214129] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5e708baf-3cf7-4c81-9883-c02aa5808b08 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2256.227741] env[62277]: DEBUG oslo_vmware.service [-] 
Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf05e3cc-7d21-40c2-a60e-0fefa603e45c {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2256.233595] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c3a210f-a25e-42a8-a7b4-b2c06e40553c {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2256.261921] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181297MB free_disk=184GB free_vcpus=48 pci_devices=None {{(pid=62277) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2256.262071] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2256.262255] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2256.334352] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 163eb4e7-33f8-4674-8a3f-5094356e250d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2256.334501] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 400beb27-a709-4ef4-851e-5caaab9ca60b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2256.334626] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 9f7d4431-d5ea-4f9b-888b-77a6a7772047 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2256.334746] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 8b9ef530-e79f-4cd4-8a88-83871ed65f90 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2256.334864] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance c4c22c8a-4300-45ce-8484-77c638f7bbc5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2256.335011] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance ad83bf06-d712-4bd4-8086-9c3b615adaf5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2256.335144] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance de543a46-26c3-40b3-9ccd-80bb1f9845d7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2256.335257] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 297d53df-7918-4389-9c63-a600755da969 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2256.335369] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 940561d5-723b-4e43-8fab-35e8af95ce09 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2256.335479] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 6737c3b9-d9e6-4879-a6df-46d3c7dee40e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2256.345497] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 4f26ed27-558d-489a-9141-ec63b6164cc8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 2256.354873] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 26a7549d-94b4-4113-ab8b-10886eafcd49 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 2256.355096] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2256.355240] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2256.489507] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-43339f02-9cf0-454a-b8da-df349cfa277a {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2256.497086] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-90a53011-eaff-4ce7-b0fd-d960c888d435 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2256.526519] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-daf8899f-25c9-412a-9af7-54cb559f610a {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2256.533215] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a06fd69-15b7-4e26-986a-9c025d8cd5a3 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2256.546188] env[62277]: DEBUG nova.compute.provider_tree [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2256.554695] env[62277]: DEBUG nova.scheduler.client.report [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2256.568143] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62277) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2256.568331] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.306s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2258.544145] env[62277]: DEBUG oslo_service.periodic_task [None 
req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2258.544600] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2259.168293] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2259.169042] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=62277) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10528}} [ 2264.539476] env[62277]: WARNING oslo_vmware.rw_handles [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2264.539476] env[62277]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2264.539476] env[62277]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2264.539476] env[62277]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2264.539476] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2264.539476] env[62277]: ERROR oslo_vmware.rw_handles response.begin() [ 2264.539476] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2264.539476] env[62277]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2264.539476] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2264.539476] env[62277]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2264.539476] env[62277]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2264.539476] env[62277]: ERROR oslo_vmware.rw_handles [ 2264.540193] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Downloaded image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to vmware_temp/77686c09-ee64-47a9-a6f2-1cf1c3716879/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2264.541922] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Caching image {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2264.542188] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None 
req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Copying Virtual Disk [datastore2] vmware_temp/77686c09-ee64-47a9-a6f2-1cf1c3716879/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk to [datastore2] vmware_temp/77686c09-ee64-47a9-a6f2-1cf1c3716879/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk {{(pid=62277) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2264.542476] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-0e861005-86db-4a83-a1ec-1ee6f1d48db0 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2264.549953] env[62277]: DEBUG oslo_vmware.api [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Waiting for the task: (returnval){ [ 2264.549953] env[62277]: value = "task-1405501" [ 2264.549953] env[62277]: _type = "Task" [ 2264.549953] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2264.557364] env[62277]: DEBUG oslo_vmware.api [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Task: {'id': task-1405501, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2265.060130] env[62277]: DEBUG oslo_vmware.exceptions [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Fault InvalidArgument not matched. 
{{(pid=62277) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2265.060417] env[62277]: DEBUG oslo_concurrency.lockutils [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2265.060980] env[62277]: ERROR nova.compute.manager [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2265.060980] env[62277]: Faults: ['InvalidArgument'] [ 2265.060980] env[62277]: ERROR nova.compute.manager [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Traceback (most recent call last): [ 2265.060980] env[62277]: ERROR nova.compute.manager [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2265.060980] env[62277]: ERROR nova.compute.manager [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] yield resources [ 2265.060980] env[62277]: ERROR nova.compute.manager [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2265.060980] env[62277]: ERROR nova.compute.manager [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] self.driver.spawn(context, instance, image_meta, [ 2265.060980] env[62277]: ERROR nova.compute.manager [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2265.060980] env[62277]: ERROR nova.compute.manager [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2265.060980] env[62277]: ERROR nova.compute.manager [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2265.060980] env[62277]: ERROR nova.compute.manager [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] self._fetch_image_if_missing(context, vi) [ 2265.060980] env[62277]: ERROR nova.compute.manager [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2265.060980] env[62277]: ERROR nova.compute.manager [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] image_cache(vi, tmp_image_ds_loc) [ 2265.060980] env[62277]: ERROR nova.compute.manager [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2265.060980] env[62277]: ERROR nova.compute.manager [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] vm_util.copy_virtual_disk( [ 2265.060980] env[62277]: ERROR nova.compute.manager [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2265.060980] env[62277]: ERROR nova.compute.manager [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] session._wait_for_task(vmdk_copy_task) [ 2265.060980] env[62277]: ERROR nova.compute.manager [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2265.060980] env[62277]: ERROR nova.compute.manager [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] return self.wait_for_task(task_ref) [ 2265.060980] env[62277]: ERROR nova.compute.manager [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2265.060980] env[62277]: ERROR nova.compute.manager [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] return evt.wait() [ 2265.060980] env[62277]: ERROR nova.compute.manager [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2265.060980] env[62277]: ERROR nova.compute.manager [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] result = hub.switch() [ 2265.060980] env[62277]: ERROR nova.compute.manager [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2265.060980] env[62277]: ERROR nova.compute.manager [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] return self.greenlet.switch() [ 2265.060980] env[62277]: ERROR nova.compute.manager [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2265.060980] env[62277]: ERROR nova.compute.manager [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] self.f(*self.args, **self.kw) [ 2265.060980] env[62277]: ERROR nova.compute.manager [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2265.060980] env[62277]: ERROR nova.compute.manager [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] raise exceptions.translate_fault(task_info.error) [ 2265.060980] env[62277]: ERROR nova.compute.manager [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2265.060980] env[62277]: ERROR nova.compute.manager [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Faults: ['InvalidArgument'] [ 2265.060980] env[62277]: ERROR nova.compute.manager [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] [ 2265.062156] env[62277]: INFO nova.compute.manager [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Terminating instance [ 2265.062836] env[62277]: DEBUG oslo_concurrency.lockutils [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2265.063034] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2265.063275] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3c3b8983-213f-4994-851d-98c7b990c809 
{{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2265.068031] env[62277]: DEBUG nova.compute.manager [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Start destroying the instance on the hypervisor. {{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2265.068031] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2265.068031] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7813eda3-dfee-44fb-9e42-82e5cff2d8cc {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2265.074348] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Unregistering the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2265.074674] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-384a8a20-f357-47d3-b8ca-4792f7092165 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2265.076828] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2265.077145] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62277) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2265.078139] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-204c458d-37b6-4d7c-854d-b31e75d40f58 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2265.082664] env[62277]: DEBUG oslo_vmware.api [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Waiting for the task: (returnval){ [ 2265.082664] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52876b8d-d56b-cdea-8e26-6f2867bd6dbb" [ 2265.082664] env[62277]: _type = "Task" [ 2265.082664] env[62277]: } to complete. 
{{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2265.089827] env[62277]: DEBUG oslo_vmware.api [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52876b8d-d56b-cdea-8e26-6f2867bd6dbb, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2265.150929] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Unregistered the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2265.151419] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Deleting contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2265.151719] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Deleting the datastore file [datastore2] 163eb4e7-33f8-4674-8a3f-5094356e250d {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2265.152095] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-65a26c51-349f-4110-872b-0bda0053fe72 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2265.158292] env[62277]: DEBUG oslo_vmware.api [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Waiting for the task: (returnval){ [ 2265.158292] env[62277]: value = "task-1405503" [ 2265.158292] env[62277]: _type = "Task" [ 2265.158292] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2265.165547] env[62277]: DEBUG oslo_vmware.api [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Task: {'id': task-1405503, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2265.593454] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Preparing fetch location {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2265.593773] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Creating directory with path [datastore2] vmware_temp/690bdf3d-adca-4875-b004-1b625dfc4273/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2265.593946] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-28742273-211a-43c3-a6e1-8418057e32f0 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2265.606365] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Created directory with path [datastore2] vmware_temp/690bdf3d-adca-4875-b004-1b625dfc4273/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2265.606545] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Fetch image to [datastore2] vmware_temp/690bdf3d-adca-4875-b004-1b625dfc4273/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2265.606707] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to [datastore2] vmware_temp/690bdf3d-adca-4875-b004-1b625dfc4273/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2265.607446] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-95176571-726a-4044-8530-56c02f461cc0 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2265.613771] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b60f1b7-877c-4273-92d7-62522e180136 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2265.622490] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ada51fcc-f8fd-49a2-bc5e-a59a079f24f1 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2265.651667] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ae032a76-3a81-4c69-afe6-1244a6b08d2c {{(pid=62277) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2265.656870] env[62277]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-1fd0c68e-e3e2-49a0-9116-2bdbefb0c6d5 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2265.667083] env[62277]: DEBUG oslo_vmware.api [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Task: {'id': task-1405503, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.13752} completed successfully. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2265.667419] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Deleted the datastore file {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2265.667652] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Deleted contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2265.667859] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2265.668102] env[62277]: INFO nova.compute.manager [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 2265.670177] env[62277]: DEBUG nova.compute.claims [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Aborting claim: {{(pid=62277) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 2265.670347] env[62277]: DEBUG oslo_concurrency.lockutils [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2265.670557] env[62277]: DEBUG oslo_concurrency.lockutils [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2265.678430] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2265.729850] env[62277]: DEBUG oslo_vmware.rw_handles [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/690bdf3d-adca-4875-b004-1b625dfc4273/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62277) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2265.789119] env[62277]: DEBUG oslo_vmware.rw_handles [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Completed reading data from the image iterator. {{(pid=62277) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2265.789317] env[62277]: DEBUG oslo_vmware.rw_handles [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/690bdf3d-adca-4875-b004-1b625dfc4273/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=62277) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2265.893386] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d5376c7f-60bf-4fe7-9e2d-e6f94d1ea467 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2265.900725] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-705aa2c5-8b01-4916-9e8f-5a657e99ed95 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2265.930456] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3d757058-29eb-43c5-b0e0-61915cb0337a {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2265.937221] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9bc66ea4-2cc7-4936-9aef-1097027de99d {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2265.949884] env[62277]: DEBUG nova.compute.provider_tree [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2265.958670] env[62277]: DEBUG nova.scheduler.client.report [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2265.972960] env[62277]: DEBUG oslo_concurrency.lockutils [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.302s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2265.973491] env[62277]: ERROR nova.compute.manager [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2265.973491] env[62277]: Faults: ['InvalidArgument'] [ 2265.973491] env[62277]: ERROR nova.compute.manager [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Traceback (most recent call last): [ 2265.973491] env[62277]: ERROR nova.compute.manager [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2265.973491] 
env[62277]: ERROR nova.compute.manager [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] self.driver.spawn(context, instance, image_meta, [ 2265.973491] env[62277]: ERROR nova.compute.manager [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2265.973491] env[62277]: ERROR nova.compute.manager [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2265.973491] env[62277]: ERROR nova.compute.manager [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2265.973491] env[62277]: ERROR nova.compute.manager [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] self._fetch_image_if_missing(context, vi) [ 2265.973491] env[62277]: ERROR nova.compute.manager [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2265.973491] env[62277]: ERROR nova.compute.manager [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] image_cache(vi, tmp_image_ds_loc) [ 2265.973491] env[62277]: ERROR nova.compute.manager [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2265.973491] env[62277]: ERROR nova.compute.manager [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] vm_util.copy_virtual_disk( [ 2265.973491] env[62277]: ERROR nova.compute.manager [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2265.973491] env[62277]: ERROR nova.compute.manager [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] session._wait_for_task(vmdk_copy_task) [ 2265.973491] env[62277]: ERROR nova.compute.manager [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2265.973491] env[62277]: ERROR nova.compute.manager [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] return self.wait_for_task(task_ref) [ 2265.973491] env[62277]: ERROR nova.compute.manager [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2265.973491] env[62277]: ERROR nova.compute.manager [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] return evt.wait() [ 2265.973491] env[62277]: ERROR nova.compute.manager [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2265.973491] env[62277]: ERROR nova.compute.manager [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] result = hub.switch() [ 2265.973491] env[62277]: ERROR nova.compute.manager [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2265.973491] env[62277]: ERROR nova.compute.manager [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] return self.greenlet.switch() [ 2265.973491] env[62277]: ERROR nova.compute.manager [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2265.973491] env[62277]: ERROR nova.compute.manager [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] self.f(*self.args, **self.kw) [ 2265.973491] env[62277]: ERROR nova.compute.manager [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2265.973491] env[62277]: ERROR nova.compute.manager [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] raise exceptions.translate_fault(task_info.error) [ 2265.973491] env[62277]: ERROR nova.compute.manager [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2265.973491] env[62277]: ERROR nova.compute.manager [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Faults: ['InvalidArgument'] [ 2265.973491] env[62277]: ERROR nova.compute.manager [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] [ 2265.974663] env[62277]: DEBUG nova.compute.utils [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] VimFaultException {{(pid=62277) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2265.975568] env[62277]: DEBUG nova.compute.manager [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Build of instance 163eb4e7-33f8-4674-8a3f-5094356e250d was re-scheduled: A specified parameter was not correct: fileType [ 2265.975568] env[62277]: Faults: ['InvalidArgument'] {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2265.975938] env[62277]: DEBUG nova.compute.manager [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Unplugging VIFs for instance {{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2265.976130] env[62277]: DEBUG nova.compute.manager [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2265.976297] env[62277]: DEBUG nova.compute.manager [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2265.976456] env[62277]: DEBUG nova.network.neutron [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2266.465511] env[62277]: DEBUG nova.network.neutron [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2266.480854] env[62277]: INFO nova.compute.manager [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Took 0.50 seconds to deallocate network for instance. [ 2266.580192] env[62277]: INFO nova.scheduler.client.report [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Deleted allocations for instance 163eb4e7-33f8-4674-8a3f-5094356e250d [ 2266.607434] env[62277]: DEBUG oslo_concurrency.lockutils [None req-fb82566c-e0af-444d-9bd2-a9be5d93abd6 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Lock "163eb4e7-33f8-4674-8a3f-5094356e250d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 681.956s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2266.609653] env[62277]: DEBUG oslo_concurrency.lockutils [None req-8b141e2a-510e-45d7-95d9-1549251e1256 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Lock "163eb4e7-33f8-4674-8a3f-5094356e250d" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 485.346s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2266.609653] env[62277]: DEBUG oslo_concurrency.lockutils [None req-8b141e2a-510e-45d7-95d9-1549251e1256 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Acquiring lock "163eb4e7-33f8-4674-8a3f-5094356e250d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2266.609653] env[62277]: DEBUG oslo_concurrency.lockutils [None req-8b141e2a-510e-45d7-95d9-1549251e1256 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Lock "163eb4e7-33f8-4674-8a3f-5094356e250d-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2266.609653] env[62277]: DEBUG oslo_concurrency.lockutils [None req-8b141e2a-510e-45d7-95d9-1549251e1256 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Lock "163eb4e7-33f8-4674-8a3f-5094356e250d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2266.611644] env[62277]: INFO nova.compute.manager [None req-8b141e2a-510e-45d7-95d9-1549251e1256 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Terminating instance [ 2266.613509] env[62277]: DEBUG nova.compute.manager [None req-8b141e2a-510e-45d7-95d9-1549251e1256 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Start destroying the instance on the hypervisor. {{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2266.613725] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-8b141e2a-510e-45d7-95d9-1549251e1256 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2266.614361] env[62277]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-b9125463-f5fb-4f04-a5c7-2fea03dc4048 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2266.624865] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d1fc0e6-f5cb-436b-a7d6-7dfc49385e1f {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2266.636794] env[62277]: DEBUG nova.compute.manager [None req-2d985159-9f30-4671-8956-f52f4b1e3bec tempest-AttachVolumeNegativeTest-184472694 tempest-AttachVolumeNegativeTest-184472694-project-member] [instance: 40a309bf-6b7f-4360-a083-640db68bb00b] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 2266.657189] env[62277]: WARNING nova.virt.vmwareapi.vmops [None req-8b141e2a-510e-45d7-95d9-1549251e1256 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 163eb4e7-33f8-4674-8a3f-5094356e250d could not be found. 
[ 2266.657423] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-8b141e2a-510e-45d7-95d9-1549251e1256 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2266.657584] env[62277]: INFO nova.compute.manager [None req-8b141e2a-510e-45d7-95d9-1549251e1256 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2266.657830] env[62277]: DEBUG oslo.service.loopingcall [None req-8b141e2a-510e-45d7-95d9-1549251e1256 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2266.658076] env[62277]: DEBUG nova.compute.manager [-] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2266.658173] env[62277]: DEBUG nova.network.neutron [-] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2266.682456] env[62277]: DEBUG nova.network.neutron [-] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2266.685400] env[62277]: DEBUG nova.compute.manager [None req-2d985159-9f30-4671-8956-f52f4b1e3bec tempest-AttachVolumeNegativeTest-184472694 tempest-AttachVolumeNegativeTest-184472694-project-member] [instance: 40a309bf-6b7f-4360-a083-640db68bb00b] Instance disappeared before build. {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 2266.690793] env[62277]: INFO nova.compute.manager [-] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] Took 0.03 seconds to deallocate network for instance. [ 2266.707188] env[62277]: DEBUG oslo_concurrency.lockutils [None req-2d985159-9f30-4671-8956-f52f4b1e3bec tempest-AttachVolumeNegativeTest-184472694 tempest-AttachVolumeNegativeTest-184472694-project-member] Lock "40a309bf-6b7f-4360-a083-640db68bb00b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 212.782s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2266.715664] env[62277]: DEBUG nova.compute.manager [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Starting instance... 
{{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 2266.768540] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2266.768771] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2266.770556] env[62277]: INFO nova.compute.claims [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2266.786797] env[62277]: DEBUG oslo_concurrency.lockutils [None req-8b141e2a-510e-45d7-95d9-1549251e1256 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Lock "163eb4e7-33f8-4674-8a3f-5094356e250d" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.178s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2266.787912] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "163eb4e7-33f8-4674-8a3f-5094356e250d" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 396.660s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2266.787912] env[62277]: INFO nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 163eb4e7-33f8-4674-8a3f-5094356e250d] During sync_power_state the instance has a pending task (deleting). Skip. 
[ 2266.788081] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "163eb4e7-33f8-4674-8a3f-5094356e250d" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2266.944021] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b1ee3cf5-1681-4324-ba10-b8b5a1b942da {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2266.951641] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5d260b72-76f6-4a43-b002-00faf49c6ab2 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2266.980643] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cfa13cb6-1472-4e2d-8feb-b8bb9f4edede {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2266.987535] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c12fb74c-f3a8-4d41-9f23-061b052f9950 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2267.000179] env[62277]: DEBUG nova.compute.provider_tree [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2267.008355] env[62277]: DEBUG nova.scheduler.client.report [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2267.021346] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.253s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2267.021792] env[62277]: DEBUG nova.compute.manager [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Start building networks asynchronously for instance. 
{{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 2267.050499] env[62277]: DEBUG nova.compute.utils [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Using /dev/sd instead of None {{(pid=62277) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2267.052023] env[62277]: DEBUG nova.compute.manager [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Allocating IP information in the background. {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 2267.052131] env[62277]: DEBUG nova.network.neutron [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] allocate_for_instance() {{(pid=62277) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2267.061701] env[62277]: DEBUG nova.compute.manager [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Start building block device mappings for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 2267.106512] env[62277]: DEBUG nova.policy [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '696edb47b3844d7499217e84fcf42619', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e7e15898bc784416bdc7fa9a9423726f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62277) authorize /opt/stack/nova/nova/policy.py:203}} [ 2267.122136] env[62277]: DEBUG nova.compute.manager [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Start spawning the instance on the hypervisor. 
{{(pid=62277) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 2267.147759] env[62277]: DEBUG nova.virt.hardware [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-11-06T22:11:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-11-06T22:11:15Z,direct_url=,disk_format='vmdk',id=6f125163-af69-40e9-92ae-3b8a01d74b60,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='63ed6630e9c140baa826f53d7a0564d1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-11-06T22:11:16Z,virtual_size=,visibility=), allow threads: False {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2267.148028] env[62277]: DEBUG nova.virt.hardware [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Flavor limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2267.148180] env[62277]: DEBUG nova.virt.hardware [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Image limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2267.148351] env[62277]: DEBUG nova.virt.hardware [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Flavor pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2267.148492] env[62277]: DEBUG nova.virt.hardware [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Image pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2267.148635] env[62277]: DEBUG nova.virt.hardware [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2267.148837] env[62277]: DEBUG nova.virt.hardware [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2267.148988] env[62277]: DEBUG nova.virt.hardware [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2267.149166] env[62277]: DEBUG nova.virt.hardware [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 
tempest-ServersTestJSON-1136389312-project-member] Got 1 possible topologies {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2267.149325] env[62277]: DEBUG nova.virt.hardware [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2267.149492] env[62277]: DEBUG nova.virt.hardware [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2267.150469] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d9402168-2c7a-4ed6-beb0-7a0792bbbffe {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2267.158595] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42906af9-fb3e-4202-a4f4-8e2ab2aa23be {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2267.392750] env[62277]: DEBUG nova.network.neutron [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Successfully created port: c9065d0f-7eea-427b-9616-e889b4145564 {{(pid=62277) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2268.064127] env[62277]: DEBUG nova.network.neutron [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Successfully updated port: c9065d0f-7eea-427b-9616-e889b4145564 {{(pid=62277) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2268.075697] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Acquiring lock "refresh_cache-4f26ed27-558d-489a-9141-ec63b6164cc8" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2268.075851] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Acquired lock "refresh_cache-4f26ed27-558d-489a-9141-ec63b6164cc8" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2268.076012] env[62277]: DEBUG nova.network.neutron [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Building network info cache for instance {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2268.125606] env[62277]: DEBUG nova.network.neutron [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Instance cache missing network info. 
{{(pid=62277) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2268.274572] env[62277]: DEBUG nova.network.neutron [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Updating instance_info_cache with network_info: [{"id": "c9065d0f-7eea-427b-9616-e889b4145564", "address": "fa:16:3e:9e:1a:e9", "network": {"id": "7efa6c69-4ed6-4615-b77a-53d6e045efc5", "bridge": "br-int", "label": "tempest-ServersTestJSON-132086172-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e7e15898bc784416bdc7fa9a9423726f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6b8137fc-f23d-49b1-b19c-3123a5588f34", "external-id": "nsx-vlan-transportzone-709", "segmentation_id": 709, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc9065d0f-7e", "ovs_interfaceid": "c9065d0f-7eea-427b-9616-e889b4145564", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2268.305446] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Releasing lock "refresh_cache-4f26ed27-558d-489a-9141-ec63b6164cc8" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2268.305446] env[62277]: DEBUG nova.compute.manager [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Instance network_info: |[{"id": "c9065d0f-7eea-427b-9616-e889b4145564", "address": "fa:16:3e:9e:1a:e9", "network": {"id": "7efa6c69-4ed6-4615-b77a-53d6e045efc5", "bridge": "br-int", "label": "tempest-ServersTestJSON-132086172-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e7e15898bc784416bdc7fa9a9423726f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6b8137fc-f23d-49b1-b19c-3123a5588f34", "external-id": "nsx-vlan-transportzone-709", "segmentation_id": 709, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc9065d0f-7e", "ovs_interfaceid": "c9065d0f-7eea-427b-9616-e889b4145564", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 2268.305446] env[62277]: 
DEBUG nova.virt.vmwareapi.vmops [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:9e:1a:e9', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '6b8137fc-f23d-49b1-b19c-3123a5588f34', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'c9065d0f-7eea-427b-9616-e889b4145564', 'vif_model': 'vmxnet3'}] {{(pid=62277) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2268.305446] env[62277]: DEBUG oslo.service.loopingcall [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2268.305446] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Creating VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2268.305446] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-79a3ff9e-8620-4c1b-a461-b160d1113b91 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2268.322760] env[62277]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2268.322760] env[62277]: value = "task-1405504" [ 2268.322760] env[62277]: _type = "Task" [ 2268.322760] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2268.331104] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405504, 'name': CreateVM_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2268.504993] env[62277]: DEBUG nova.compute.manager [req-3d2619d5-a5de-4c74-858b-97ab58a7decc req-236e5c5a-ab18-49c5-880a-6b7c2b0f71ab service nova] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Received event network-vif-plugged-c9065d0f-7eea-427b-9616-e889b4145564 {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 2268.505258] env[62277]: DEBUG oslo_concurrency.lockutils [req-3d2619d5-a5de-4c74-858b-97ab58a7decc req-236e5c5a-ab18-49c5-880a-6b7c2b0f71ab service nova] Acquiring lock "4f26ed27-558d-489a-9141-ec63b6164cc8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2268.505321] env[62277]: DEBUG oslo_concurrency.lockutils [req-3d2619d5-a5de-4c74-858b-97ab58a7decc req-236e5c5a-ab18-49c5-880a-6b7c2b0f71ab service nova] Lock "4f26ed27-558d-489a-9141-ec63b6164cc8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2268.505490] env[62277]: DEBUG oslo_concurrency.lockutils [req-3d2619d5-a5de-4c74-858b-97ab58a7decc req-236e5c5a-ab18-49c5-880a-6b7c2b0f71ab service nova] Lock "4f26ed27-558d-489a-9141-ec63b6164cc8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2268.505644] env[62277]: DEBUG nova.compute.manager [req-3d2619d5-a5de-4c74-858b-97ab58a7decc req-236e5c5a-ab18-49c5-880a-6b7c2b0f71ab service nova] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] No waiting events found dispatching network-vif-plugged-c9065d0f-7eea-427b-9616-e889b4145564 {{(pid=62277) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2268.505810] env[62277]: WARNING nova.compute.manager [req-3d2619d5-a5de-4c74-858b-97ab58a7decc req-236e5c5a-ab18-49c5-880a-6b7c2b0f71ab service nova] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Received unexpected event network-vif-plugged-c9065d0f-7eea-427b-9616-e889b4145564 for instance with vm_state building and task_state spawning. [ 2268.505964] env[62277]: DEBUG nova.compute.manager [req-3d2619d5-a5de-4c74-858b-97ab58a7decc req-236e5c5a-ab18-49c5-880a-6b7c2b0f71ab service nova] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Received event network-changed-c9065d0f-7eea-427b-9616-e889b4145564 {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 2268.506134] env[62277]: DEBUG nova.compute.manager [req-3d2619d5-a5de-4c74-858b-97ab58a7decc req-236e5c5a-ab18-49c5-880a-6b7c2b0f71ab service nova] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Refreshing instance network info cache due to event network-changed-c9065d0f-7eea-427b-9616-e889b4145564. 
{{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11104}} [ 2268.506313] env[62277]: DEBUG oslo_concurrency.lockutils [req-3d2619d5-a5de-4c74-858b-97ab58a7decc req-236e5c5a-ab18-49c5-880a-6b7c2b0f71ab service nova] Acquiring lock "refresh_cache-4f26ed27-558d-489a-9141-ec63b6164cc8" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2268.506446] env[62277]: DEBUG oslo_concurrency.lockutils [req-3d2619d5-a5de-4c74-858b-97ab58a7decc req-236e5c5a-ab18-49c5-880a-6b7c2b0f71ab service nova] Acquired lock "refresh_cache-4f26ed27-558d-489a-9141-ec63b6164cc8" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2268.506598] env[62277]: DEBUG nova.network.neutron [req-3d2619d5-a5de-4c74-858b-97ab58a7decc req-236e5c5a-ab18-49c5-880a-6b7c2b0f71ab service nova] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Refreshing network info cache for port c9065d0f-7eea-427b-9616-e889b4145564 {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 2268.786662] env[62277]: DEBUG nova.network.neutron [req-3d2619d5-a5de-4c74-858b-97ab58a7decc req-236e5c5a-ab18-49c5-880a-6b7c2b0f71ab service nova] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Updated VIF entry in instance network info cache for port c9065d0f-7eea-427b-9616-e889b4145564. {{(pid=62277) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 2268.787031] env[62277]: DEBUG nova.network.neutron [req-3d2619d5-a5de-4c74-858b-97ab58a7decc req-236e5c5a-ab18-49c5-880a-6b7c2b0f71ab service nova] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Updating instance_info_cache with network_info: [{"id": "c9065d0f-7eea-427b-9616-e889b4145564", "address": "fa:16:3e:9e:1a:e9", "network": {"id": "7efa6c69-4ed6-4615-b77a-53d6e045efc5", "bridge": "br-int", "label": "tempest-ServersTestJSON-132086172-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e7e15898bc784416bdc7fa9a9423726f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6b8137fc-f23d-49b1-b19c-3123a5588f34", "external-id": "nsx-vlan-transportzone-709", "segmentation_id": 709, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc9065d0f-7e", "ovs_interfaceid": "c9065d0f-7eea-427b-9616-e889b4145564", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2268.796682] env[62277]: DEBUG oslo_concurrency.lockutils [req-3d2619d5-a5de-4c74-858b-97ab58a7decc req-236e5c5a-ab18-49c5-880a-6b7c2b0f71ab service nova] Releasing lock "refresh_cache-4f26ed27-558d-489a-9141-ec63b6164cc8" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2268.832969] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405504, 'name': CreateVM_Task, 'duration_secs': 0.327838} completed successfully. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2268.833160] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Created VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2268.839585] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2268.839746] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2268.840069] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2268.840313] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-626f9b34-f16b-4209-a273-5249b7bb7b8f {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2268.844798] env[62277]: DEBUG oslo_vmware.api [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Waiting for the task: (returnval){ [ 2268.844798] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]523f4304-ecb1-f384-e133-53e62d4f2039" [ 2268.844798] env[62277]: _type = "Task" [ 2268.844798] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2268.856361] env[62277]: DEBUG oslo_vmware.api [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]523f4304-ecb1-f384-e133-53e62d4f2039, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2269.355051] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2269.355373] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Processing image 6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2269.355517] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2286.477736] env[62277]: DEBUG oslo_concurrency.lockutils [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Acquiring lock "295cb2fd-b409-4d5c-8fef-12b7acd9fec0" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2286.478010] env[62277]: DEBUG oslo_concurrency.lockutils [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Lock "295cb2fd-b409-4d5c-8fef-12b7acd9fec0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2312.170887] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2313.168967] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2314.556773] env[62277]: WARNING oslo_vmware.rw_handles [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2314.556773] env[62277]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2314.556773] env[62277]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2314.556773] env[62277]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2314.556773] env[62277]: ERROR 
oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2314.556773] env[62277]: ERROR oslo_vmware.rw_handles response.begin() [ 2314.556773] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2314.556773] env[62277]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2314.556773] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2314.556773] env[62277]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2314.556773] env[62277]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2314.556773] env[62277]: ERROR oslo_vmware.rw_handles [ 2314.556773] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Downloaded image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to vmware_temp/690bdf3d-adca-4875-b004-1b625dfc4273/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2314.559204] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Caching image {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2314.559537] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Copying Virtual Disk [datastore2] vmware_temp/690bdf3d-adca-4875-b004-1b625dfc4273/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk to [datastore2] vmware_temp/690bdf3d-adca-4875-b004-1b625dfc4273/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk {{(pid=62277) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2314.559884] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-9195fcc6-ed46-466d-99c8-45b2b6445405 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2314.567974] env[62277]: DEBUG oslo_vmware.api [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Waiting for the task: (returnval){ [ 2314.567974] env[62277]: value = "task-1405505" [ 2314.567974] env[62277]: _type = "Task" [ 2314.567974] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2314.576104] env[62277]: DEBUG oslo_vmware.api [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Task: {'id': task-1405505, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2315.079733] env[62277]: DEBUG oslo_vmware.exceptions [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Fault InvalidArgument not matched. {{(pid=62277) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2315.079733] env[62277]: DEBUG oslo_concurrency.lockutils [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2315.079923] env[62277]: ERROR nova.compute.manager [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2315.079923] env[62277]: Faults: ['InvalidArgument'] [ 2315.079923] env[62277]: ERROR nova.compute.manager [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Traceback (most recent call last): [ 2315.079923] env[62277]: ERROR nova.compute.manager [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2315.079923] env[62277]: ERROR nova.compute.manager [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] yield resources [ 2315.079923] env[62277]: ERROR nova.compute.manager [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2315.079923] env[62277]: ERROR nova.compute.manager [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] self.driver.spawn(context, instance, image_meta, [ 2315.079923] env[62277]: ERROR nova.compute.manager [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2315.079923] env[62277]: ERROR nova.compute.manager [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2315.079923] env[62277]: ERROR nova.compute.manager [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2315.079923] env[62277]: ERROR nova.compute.manager [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] self._fetch_image_if_missing(context, vi) [ 2315.079923] env[62277]: ERROR nova.compute.manager [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2315.079923] env[62277]: ERROR nova.compute.manager [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] image_cache(vi, tmp_image_ds_loc) [ 2315.079923] env[62277]: ERROR nova.compute.manager [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2315.079923] env[62277]: ERROR nova.compute.manager [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] vm_util.copy_virtual_disk( [ 2315.079923] env[62277]: ERROR nova.compute.manager [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] File 
"/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2315.079923] env[62277]: ERROR nova.compute.manager [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] session._wait_for_task(vmdk_copy_task) [ 2315.079923] env[62277]: ERROR nova.compute.manager [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2315.079923] env[62277]: ERROR nova.compute.manager [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] return self.wait_for_task(task_ref) [ 2315.079923] env[62277]: ERROR nova.compute.manager [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2315.079923] env[62277]: ERROR nova.compute.manager [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] return evt.wait() [ 2315.079923] env[62277]: ERROR nova.compute.manager [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2315.079923] env[62277]: ERROR nova.compute.manager [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] result = hub.switch() [ 2315.079923] env[62277]: ERROR nova.compute.manager [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2315.079923] env[62277]: ERROR nova.compute.manager [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] return self.greenlet.switch() [ 2315.079923] env[62277]: ERROR nova.compute.manager [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2315.079923] env[62277]: ERROR nova.compute.manager [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] self.f(*self.args, **self.kw) [ 2315.079923] env[62277]: ERROR nova.compute.manager [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2315.079923] env[62277]: ERROR nova.compute.manager [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] raise exceptions.translate_fault(task_info.error) [ 2315.079923] env[62277]: ERROR nova.compute.manager [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2315.079923] env[62277]: ERROR nova.compute.manager [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Faults: ['InvalidArgument'] [ 2315.079923] env[62277]: ERROR nova.compute.manager [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] [ 2315.080750] env[62277]: INFO nova.compute.manager [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Terminating instance [ 2315.081730] env[62277]: DEBUG oslo_concurrency.lockutils [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2315.081929] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 
tempest-ServersAaction247Test-2025697911-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2315.082190] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-fad64fc6-d347-4d83-8449-ac6ad477baff {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2315.084333] env[62277]: DEBUG nova.compute.manager [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Start destroying the instance on the hypervisor. {{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2315.084541] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2315.085278] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-38e3cfed-896b-4a04-9e0a-8c2bfc0ad5fa {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2315.092579] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Unregistering the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2315.093567] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-bbdee8b6-759c-4e2e-be29-c4c7fe1be772 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2315.094909] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2315.095095] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62277) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2315.095766] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2e939d41-aa6c-49ae-ab55-a853395cbc51 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2315.100491] env[62277]: DEBUG oslo_vmware.api [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Waiting for the task: (returnval){ [ 2315.100491] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]526ac6ac-ff7b-f49f-6ab9-3ab50d390e11" [ 2315.100491] env[62277]: _type = "Task" [ 2315.100491] env[62277]: } to complete. 
{{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2315.107415] env[62277]: DEBUG oslo_vmware.api [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]526ac6ac-ff7b-f49f-6ab9-3ab50d390e11, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2315.164372] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Unregistered the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2315.164574] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Deleting contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2315.164752] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Deleting the datastore file [datastore2] 400beb27-a709-4ef4-851e-5caaab9ca60b {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2315.165013] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-e6cc0d04-d72f-44d7-835d-7142046e624a {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2315.171512] env[62277]: DEBUG oslo_vmware.api [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Waiting for the task: (returnval){ [ 2315.171512] env[62277]: value = "task-1405507" [ 2315.171512] env[62277]: _type = "Task" [ 2315.171512] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2315.179184] env[62277]: DEBUG oslo_vmware.api [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Task: {'id': task-1405507, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2315.610914] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Preparing fetch location {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2315.611261] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Creating directory with path [datastore2] vmware_temp/1130dcf1-8af3-4405-b6de-d2c0a3c195ab/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2315.611363] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9be4338e-6982-41d8-9fbb-e34278353d29 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2315.622199] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Created directory with path [datastore2] vmware_temp/1130dcf1-8af3-4405-b6de-d2c0a3c195ab/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2315.622383] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Fetch image to [datastore2] vmware_temp/1130dcf1-8af3-4405-b6de-d2c0a3c195ab/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2315.622548] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to [datastore2] vmware_temp/1130dcf1-8af3-4405-b6de-d2c0a3c195ab/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2315.623241] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-47adce2f-bc42-4fbd-bc8a-e6d6cc828bb0 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2315.629304] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35eee019-a886-4b43-8dfd-4e18aef5a29d {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2315.637959] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-116f6431-c70b-4730-bfc0-416d9cb47a62 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2315.668372] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-840b259f-1543-474d-9de9-def00a95796d 
{{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2315.676405] env[62277]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-bacebd00-f244-4279-81a3-2df2a6015673 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2315.680503] env[62277]: DEBUG oslo_vmware.api [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Task: {'id': task-1405507, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.077605} completed successfully. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2315.680976] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Deleted the datastore file {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2315.681177] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Deleted contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2315.681353] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2315.681521] env[62277]: INFO nova.compute.manager [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Took 0.60 seconds to destroy the instance on the hypervisor. 
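Annotation. The CopyVirtualDisk_Task and DeleteDatastoreFile_Task entries above all go through the same polling path that the traceback shows: nova's session._wait_for_task hands the task reference to oslo_vmware.api wait_for_task, a looping call repeatedly invokes _poll_task, and when the task ends in an error state the poller raises exceptions.translate_fault(task_info.error), which surfaces here as VimFaultException with Faults: ['InvalidArgument']. The following is only a minimal illustrative sketch of that poll-and-translate pattern, not the oslo.vmware implementation; fetch_task_info and FaultException are hypothetical stand-ins for the real vSphere task lookup and translated fault class.

import time

class FaultException(Exception):
    """Stand-in for a translated task fault (cf. VimFaultException in the log)."""
    def __init__(self, msg, fault_list):
        super().__init__(msg)
        self.fault_list = fault_list

def wait_for_task(task_ref, fetch_task_info, poll_interval=0.5):
    """Poll a vCenter-style task until it reaches a terminal state.

    fetch_task_info is a hypothetical callable returning an object with
    `state` ('running', 'success' or 'error'), `progress`, and, on error,
    an `error` carrying `msg` and `fault_list` -- the fields the log's
    _poll_task reads before translating the fault.
    """
    while True:
        info = fetch_task_info(task_ref)
        if info.state == 'success':
            return info
        if info.state == 'error':
            # Corresponds to: raise exceptions.translate_fault(task_info.error)
            raise FaultException(info.error.msg, info.error.fault_list)
        # Task still queued/running: report progress and poll again,
        # matching the "progress is 0%" entries above.
        print(f"Task {task_ref} progress is {info.progress}%")
        time.sleep(poll_interval)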
[ 2315.683616] env[62277]: DEBUG nova.compute.claims [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Aborting claim: {{(pid=62277) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 2315.683790] env[62277]: DEBUG oslo_concurrency.lockutils [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2315.683997] env[62277]: DEBUG oslo_concurrency.lockutils [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2315.698661] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2315.833725] env[62277]: DEBUG oslo_concurrency.lockutils [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2315.834554] env[62277]: ERROR nova.compute.manager [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 6f125163-af69-40e9-92ae-3b8a01d74b60. 
[ 2315.834554] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Traceback (most recent call last): [ 2315.834554] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 2315.834554] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 2315.834554] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 2315.834554] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] result = getattr(controller, method)(*args, **kwargs) [ 2315.834554] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 2315.834554] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] return self._get(image_id) [ 2315.834554] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 2315.834554] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] return RequestIdProxy(wrapped(*args, **kwargs)) [ 2315.834554] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 2315.834554] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] resp, body = self.http_client.get(url, headers=header) [ 2315.834554] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 2315.834554] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] return self.request(url, 'GET', **kwargs) [ 2315.834554] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 2315.834554] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] return self._handle_response(resp) [ 2315.834554] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 2315.834554] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] raise exc.from_response(resp, resp.content) [ 2315.834554] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 2315.834554] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] [ 2315.834554] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] During handling of the above exception, another exception occurred: [ 2315.834554] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] [ 2315.834554] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Traceback (most recent call last): [ 2315.834554] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2315.834554] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] yield resources [ 2315.834554] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2315.834554] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] self.driver.spawn(context, instance, image_meta, [ 2315.834554] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2315.834554] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2315.834554] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2315.834554] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] self._fetch_image_if_missing(context, vi) [ 2315.835458] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 2315.835458] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] image_fetch(context, vi, tmp_image_ds_loc) [ 2315.835458] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 2315.835458] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] images.fetch_image( [ 2315.835458] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 2315.835458] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] metadata = IMAGE_API.get(context, image_ref) [ 2315.835458] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 2315.835458] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] return session.show(context, image_id, [ 2315.835458] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 2315.835458] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] _reraise_translated_image_exception(image_id) [ 2315.835458] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File 
"/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 2315.835458] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] raise new_exc.with_traceback(exc_trace) [ 2315.835458] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 2315.835458] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 2315.835458] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 2315.835458] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] result = getattr(controller, method)(*args, **kwargs) [ 2315.835458] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 2315.835458] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] return self._get(image_id) [ 2315.835458] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 2315.835458] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] return RequestIdProxy(wrapped(*args, **kwargs)) [ 2315.835458] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 2315.835458] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] resp, body = self.http_client.get(url, headers=header) [ 2315.835458] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 2315.835458] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] return self.request(url, 'GET', **kwargs) [ 2315.835458] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 2315.835458] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] return self._handle_response(resp) [ 2315.835458] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 2315.835458] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] raise exc.from_response(resp, resp.content) [ 2315.835458] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] nova.exception.ImageNotAuthorized: Not authorized for image 6f125163-af69-40e9-92ae-3b8a01d74b60. 
[ 2315.835458] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] [ 2315.835458] env[62277]: INFO nova.compute.manager [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Terminating instance [ 2315.836357] env[62277]: DEBUG oslo_concurrency.lockutils [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2315.836565] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2315.837083] env[62277]: DEBUG oslo_concurrency.lockutils [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Acquiring lock "refresh_cache-9f7d4431-d5ea-4f9b-888b-77a6a7772047" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2315.837236] env[62277]: DEBUG oslo_concurrency.lockutils [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Acquired lock "refresh_cache-9f7d4431-d5ea-4f9b-888b-77a6a7772047" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2315.837399] env[62277]: DEBUG nova.network.neutron [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Building network info cache for instance {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2315.840198] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b4cda804-9ab9-4472-950d-6b511fb716d4 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2315.849026] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2315.849148] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=62277) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2315.849868] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-440c9dfb-5830-4341-9f9e-9840ad1f08e3 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2315.857026] env[62277]: DEBUG oslo_vmware.api [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Waiting for the task: (returnval){ [ 2315.857026] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]5215ffb3-c7b0-c10b-67ed-f2e326b6dab1" [ 2315.857026] env[62277]: _type = "Task" [ 2315.857026] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2315.864414] env[62277]: DEBUG oslo_vmware.api [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]5215ffb3-c7b0-c10b-67ed-f2e326b6dab1, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2315.886428] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-03d72966-b356-44c9-b14d-6a6d7d8e5e99 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2315.889339] env[62277]: DEBUG nova.network.neutron [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Instance cache missing network info. 
{{(pid=62277) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2315.895282] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-314ba206-80b0-468c-9bc1-217df2e07140 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2315.927126] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-72431d61-7ccb-4000-8a91-4ae7d523df86 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2315.934091] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-efd3a8cd-fa34-4262-86cb-c9557b8ebdef {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2315.946973] env[62277]: DEBUG nova.compute.provider_tree [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2315.955601] env[62277]: DEBUG nova.scheduler.client.report [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2315.970979] env[62277]: DEBUG oslo_concurrency.lockutils [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.287s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2315.971618] env[62277]: ERROR nova.compute.manager [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2315.971618] env[62277]: Faults: ['InvalidArgument'] [ 2315.971618] env[62277]: ERROR nova.compute.manager [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Traceback (most recent call last): [ 2315.971618] env[62277]: ERROR nova.compute.manager [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2315.971618] env[62277]: ERROR nova.compute.manager [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] self.driver.spawn(context, instance, image_meta, [ 2315.971618] env[62277]: ERROR nova.compute.manager [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 
2315.971618] env[62277]: ERROR nova.compute.manager [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2315.971618] env[62277]: ERROR nova.compute.manager [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2315.971618] env[62277]: ERROR nova.compute.manager [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] self._fetch_image_if_missing(context, vi) [ 2315.971618] env[62277]: ERROR nova.compute.manager [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2315.971618] env[62277]: ERROR nova.compute.manager [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] image_cache(vi, tmp_image_ds_loc) [ 2315.971618] env[62277]: ERROR nova.compute.manager [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2315.971618] env[62277]: ERROR nova.compute.manager [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] vm_util.copy_virtual_disk( [ 2315.971618] env[62277]: ERROR nova.compute.manager [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2315.971618] env[62277]: ERROR nova.compute.manager [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] session._wait_for_task(vmdk_copy_task) [ 2315.971618] env[62277]: ERROR nova.compute.manager [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2315.971618] env[62277]: ERROR nova.compute.manager [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] return self.wait_for_task(task_ref) [ 2315.971618] env[62277]: ERROR nova.compute.manager [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2315.971618] env[62277]: ERROR nova.compute.manager [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] return evt.wait() [ 2315.971618] env[62277]: ERROR nova.compute.manager [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2315.971618] env[62277]: ERROR nova.compute.manager [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] result = hub.switch() [ 2315.971618] env[62277]: ERROR nova.compute.manager [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2315.971618] env[62277]: ERROR nova.compute.manager [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] return self.greenlet.switch() [ 2315.971618] env[62277]: ERROR nova.compute.manager [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2315.971618] env[62277]: ERROR nova.compute.manager [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] self.f(*self.args, **self.kw) [ 2315.971618] env[62277]: ERROR nova.compute.manager [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2315.971618] env[62277]: ERROR nova.compute.manager [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] raise exceptions.translate_fault(task_info.error) [ 2315.971618] env[62277]: ERROR nova.compute.manager [instance: 
400beb27-a709-4ef4-851e-5caaab9ca60b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2315.971618] env[62277]: ERROR nova.compute.manager [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Faults: ['InvalidArgument'] [ 2315.971618] env[62277]: ERROR nova.compute.manager [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] [ 2315.972451] env[62277]: DEBUG nova.compute.utils [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] VimFaultException {{(pid=62277) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2315.973732] env[62277]: DEBUG nova.compute.manager [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Build of instance 400beb27-a709-4ef4-851e-5caaab9ca60b was re-scheduled: A specified parameter was not correct: fileType [ 2315.973732] env[62277]: Faults: ['InvalidArgument'] {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2315.974108] env[62277]: DEBUG nova.compute.manager [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Unplugging VIFs for instance {{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2315.974277] env[62277]: DEBUG nova.compute.manager [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2315.974465] env[62277]: DEBUG nova.compute.manager [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2315.974650] env[62277]: DEBUG nova.network.neutron [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2315.977066] env[62277]: DEBUG nova.network.neutron [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2315.984398] env[62277]: DEBUG oslo_concurrency.lockutils [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Releasing lock "refresh_cache-9f7d4431-d5ea-4f9b-888b-77a6a7772047" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2315.984817] env[62277]: DEBUG nova.compute.manager [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Start destroying the instance on the hypervisor. 
{{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2315.986057] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2315.986308] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-179540a0-6787-4d63-bc4b-16bcb22ae5b1 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2315.993317] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Unregistering the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2315.993768] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-021e65c7-01d7-42a8-8f47-b944e03489cf {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2316.018654] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Unregistered the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2316.018857] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Deleting contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2316.019036] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Deleting the datastore file [datastore2] 9f7d4431-d5ea-4f9b-888b-77a6a7772047 {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2316.019533] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-efc0bbf3-988e-4785-86f6-52f0141ebf12 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2316.025161] env[62277]: DEBUG oslo_vmware.api [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Waiting for the task: (returnval){ [ 2316.025161] env[62277]: value = "task-1405509" [ 2316.025161] env[62277]: _type = "Task" [ 2316.025161] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2316.032339] env[62277]: DEBUG oslo_vmware.api [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Task: {'id': task-1405509, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2316.163948] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2316.167713] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2316.167922] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Starting heal instance info cache {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9909}} [ 2316.168096] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Rebuilding the list of instances to heal {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9913}} [ 2316.189850] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2316.190164] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2316.190164] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2316.190266] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2316.190382] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2316.190503] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 297d53df-7918-4389-9c63-a600755da969] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2316.190623] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Skipping network cache update for instance because it is Building. 
{{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2316.190741] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2316.190885] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2316.191012] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Didn't find any instances for network info cache update. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9995}} [ 2316.191514] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2316.265952] env[62277]: DEBUG nova.network.neutron [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2316.277678] env[62277]: INFO nova.compute.manager [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Took 0.30 seconds to deallocate network for instance. 
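Annotation. The recurring "Acquiring lock ... / Lock ... acquired ... waited N s / Lock ... released ... held N s" pairs (for example around ResourceTracker.abort_instance_claim above) are emitted from oslo_concurrency/lockutils.py, as the lockutils.py:404/409/423 call sites in those entries indicate; the named "compute_resources" lock serializes all claim accounting on the host. Below is only a rough usage sketch of that pattern with a hypothetical class and method bodies; the lockutils.synchronized decorator is the real oslo.concurrency API, everything else is illustrative.

from oslo_concurrency import lockutils

class ResourceTrackerSketch:
    """Illustrative stand-in for claim accounting guarded by a named lock."""

    @lockutils.synchronized('compute_resources')
    def instance_claim(self, instance, resources):
        # Runs under the same named lock the log shows being acquired,
        # so a claim can never interleave with an abort.
        resources['used_mb'] += instance['memory_mb']

    @lockutils.synchronized('compute_resources')
    def abort_instance_claim(self, instance, resources):
        # Mirrors the 'Lock "compute_resources" acquired by ...
        # abort_instance_claim' / 'released ... held N s' pairs above.
        resources['used_mb'] -= instance['memory_mb']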
[ 2316.369094] env[62277]: INFO nova.scheduler.client.report [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Deleted allocations for instance 400beb27-a709-4ef4-851e-5caaab9ca60b [ 2316.374537] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Preparing fetch location {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2316.374799] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Creating directory with path [datastore2] vmware_temp/0dde662e-ad74-4745-ac67-87f037ae3caa/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2316.375057] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a6bdb874-3605-4352-9cdd-02c59eefbd27 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2316.389051] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Created directory with path [datastore2] vmware_temp/0dde662e-ad74-4745-ac67-87f037ae3caa/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2316.389051] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Fetch image to [datastore2] vmware_temp/0dde662e-ad74-4745-ac67-87f037ae3caa/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2316.389051] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to [datastore2] vmware_temp/0dde662e-ad74-4745-ac67-87f037ae3caa/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2316.389697] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb27ef11-1a59-4c6f-97de-ed717aaf0005 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2316.398025] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b60f4b55-fccd-4c80-9338-019faa7c2318 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2316.400423] env[62277]: DEBUG oslo_concurrency.lockutils [None req-edf054f3-aec1-42b9-aa54-272a6f5f6754 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Lock "400beb27-a709-4ef4-851e-5caaab9ca60b" "released" by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 663.923s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2316.402478] env[62277]: DEBUG oslo_concurrency.lockutils [None req-88bc2231-abac-4b6e-82e1-07919e9ebcd4 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Lock "400beb27-a709-4ef4-851e-5caaab9ca60b" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 468.107s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2316.402885] env[62277]: DEBUG oslo_concurrency.lockutils [None req-88bc2231-abac-4b6e-82e1-07919e9ebcd4 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Acquiring lock "400beb27-a709-4ef4-851e-5caaab9ca60b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2316.403222] env[62277]: DEBUG oslo_concurrency.lockutils [None req-88bc2231-abac-4b6e-82e1-07919e9ebcd4 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Lock "400beb27-a709-4ef4-851e-5caaab9ca60b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2316.403558] env[62277]: DEBUG oslo_concurrency.lockutils [None req-88bc2231-abac-4b6e-82e1-07919e9ebcd4 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Lock "400beb27-a709-4ef4-851e-5caaab9ca60b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2316.412133] env[62277]: INFO nova.compute.manager [None req-88bc2231-abac-4b6e-82e1-07919e9ebcd4 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Terminating instance [ 2316.413685] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f69edcad-ec20-495e-a8a6-aef0c544128d {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2316.419396] env[62277]: DEBUG nova.compute.manager [None req-88bc2231-abac-4b6e-82e1-07919e9ebcd4 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Start destroying the instance on the hypervisor. 
{{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2316.419587] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-88bc2231-abac-4b6e-82e1-07919e9ebcd4 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2316.420323] env[62277]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-0e509633-ee34-4eaf-b211-b72d51c502a6 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2316.422078] env[62277]: DEBUG nova.compute.manager [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 2316.453132] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-348f2d56-2afc-4c06-ab54-30b73089d3ef {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2316.458448] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e486d44-e37a-47a1-b447-a59c9e29b997 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2316.475490] env[62277]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-ebe53836-7d2c-49c8-9d0a-7a5002930965 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2316.489490] env[62277]: WARNING nova.virt.vmwareapi.vmops [None req-88bc2231-abac-4b6e-82e1-07919e9ebcd4 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 400beb27-a709-4ef4-851e-5caaab9ca60b could not be found. [ 2316.489705] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-88bc2231-abac-4b6e-82e1-07919e9ebcd4 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2316.489878] env[62277]: INFO nova.compute.manager [None req-88bc2231-abac-4b6e-82e1-07919e9ebcd4 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Took 0.07 seconds to destroy the instance on the hypervisor. [ 2316.490129] env[62277]: DEBUG oslo.service.loopingcall [None req-88bc2231-abac-4b6e-82e1-07919e9ebcd4 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2316.491091] env[62277]: DEBUG oslo_concurrency.lockutils [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2316.491325] env[62277]: DEBUG oslo_concurrency.lockutils [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2316.492908] env[62277]: INFO nova.compute.claims [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2316.495364] env[62277]: DEBUG nova.compute.manager [-] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2316.495466] env[62277]: DEBUG nova.network.neutron [-] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2316.498791] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2316.530332] env[62277]: DEBUG nova.network.neutron [-] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2316.538239] env[62277]: DEBUG oslo_vmware.api [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Task: {'id': task-1405509, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.03704} completed successfully. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2316.540111] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Deleted the datastore file {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2316.540310] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Deleted contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2316.540480] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2316.540649] env[62277]: INFO nova.compute.manager [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Took 0.56 seconds to destroy the instance on the hypervisor. [ 2316.540882] env[62277]: DEBUG oslo.service.loopingcall [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2316.541473] env[62277]: DEBUG nova.compute.manager [-] [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Skipping network deallocation for instance since networking was not requested. {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 2316.543646] env[62277]: INFO nova.compute.manager [-] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] Took 0.05 seconds to deallocate network for instance. [ 2316.546320] env[62277]: DEBUG nova.compute.claims [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Aborting claim: {{(pid=62277) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 2316.546488] env[62277]: DEBUG oslo_concurrency.lockutils [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2316.553396] env[62277]: DEBUG oslo_vmware.rw_handles [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0dde662e-ad74-4745-ac67-87f037ae3caa/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=62277) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2316.617396] env[62277]: DEBUG oslo_vmware.rw_handles [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Completed reading data from the image iterator. {{(pid=62277) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2316.617659] env[62277]: DEBUG oslo_vmware.rw_handles [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0dde662e-ad74-4745-ac67-87f037ae3caa/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62277) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2316.676516] env[62277]: DEBUG oslo_concurrency.lockutils [None req-88bc2231-abac-4b6e-82e1-07919e9ebcd4 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Lock "400beb27-a709-4ef4-851e-5caaab9ca60b" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.274s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2316.677438] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "400beb27-a709-4ef4-851e-5caaab9ca60b" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 446.550s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2316.677622] env[62277]: INFO nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 400beb27-a709-4ef4-851e-5caaab9ca60b] During sync_power_state the instance has a pending task (deleting). Skip. 
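The "waited N s" / "held N s" accounting in the lockutils lines throughout this log (e.g. "waited 468.107s", "held 663.923s" for lock "400beb27-a709-4ef4-851e-5caaab9ca60b") is emitted by the oslo_concurrency lock wrapper (lockutils.py, as shown in the {{...}} suffixes): it times how long a caller blocked waiting for a named lock and how long it then held it. Below is a minimal, stdlib-only sketch of that pattern, not the oslo_concurrency implementation; the lock name and caller strings are illustrative values copied from the log.

import threading
import time
from contextlib import contextmanager

_locks = {}                        # name -> threading.Lock
_registry_guard = threading.Lock()

@contextmanager
def timed_lock(name, caller):
    # Get-or-create the named lock; the per-instance-UUID locks in the log
    # behave the same way (one lock object per instance name).
    with _registry_guard:
        lock = _locks.setdefault(name, threading.Lock())
    print('Acquiring lock "%s" by "%s"' % (name, caller))
    start_wait = time.monotonic()
    lock.acquire()
    waited = time.monotonic() - start_wait
    print('Lock "%s" acquired by "%s" :: waited %.3fs' % (name, caller, waited))
    start_hold = time.monotonic()
    try:
        yield
    finally:
        lock.release()
        held = time.monotonic() - start_hold
        print('Lock "%s" "released" by "%s" :: held %.3fs' % (name, caller, held))

if __name__ == '__main__':
    # One thread holds the instance lock (the long-running build); a second
    # caller (terminate) blocks on the same name, which is where the large
    # "waited" values in the log come from.
    def build():
        with timed_lock('400beb27-a709-4ef4-851e-5caaab9ca60b',
                        '_locked_do_build_and_run_instance'):
            time.sleep(0.2)   # stand-in for the build work

    t = threading.Thread(target=build)
    t.start()
    time.sleep(0.05)
    with timed_lock('400beb27-a709-4ef4-851e-5caaab9ca60b',
                    'do_terminate_instance'):
        pass
    t.join()
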
[ 2316.677792] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "400beb27-a709-4ef4-851e-5caaab9ca60b" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2316.733298] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e3ed03a3-ab08-4e86-a33c-a73174bdf0c1 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2316.740895] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a1d13b35-108c-4ef3-9137-884e9e22a327 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2316.769856] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e1dc61df-54f9-48dd-994d-22ede57df184 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2316.776953] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-94d0d911-13aa-4dc4-b501-2bf5f93aa5d5 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2316.790595] env[62277]: DEBUG nova.compute.provider_tree [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2316.799458] env[62277]: DEBUG nova.scheduler.client.report [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2316.812423] env[62277]: DEBUG oslo_concurrency.lockutils [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.321s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2316.812890] env[62277]: DEBUG nova.compute.manager [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Start building networks asynchronously for instance. 
{{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 2316.815312] env[62277]: DEBUG oslo_concurrency.lockutils [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.269s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2316.856475] env[62277]: DEBUG nova.compute.utils [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Using /dev/sd instead of None {{(pid=62277) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2316.861048] env[62277]: DEBUG nova.compute.manager [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Allocating IP information in the background. {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 2316.861048] env[62277]: DEBUG nova.network.neutron [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] allocate_for_instance() {{(pid=62277) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2316.866640] env[62277]: DEBUG nova.compute.manager [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Start building block device mappings for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 2316.925932] env[62277]: DEBUG nova.policy [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4600f6c9a0554b8a8077a3977337bfde', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bd7e0cacdaeb4e6e80d603d41978a23f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62277) authorize /opt/stack/nova/nova/policy.py:203}} [ 2316.938884] env[62277]: DEBUG nova.compute.manager [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Start spawning the instance on the hypervisor. 
{{(pid=62277) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 2316.966805] env[62277]: DEBUG nova.virt.hardware [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-11-06T22:11:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-11-06T22:11:15Z,direct_url=,disk_format='vmdk',id=6f125163-af69-40e9-92ae-3b8a01d74b60,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='63ed6630e9c140baa826f53d7a0564d1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-11-06T22:11:16Z,virtual_size=,visibility=), allow threads: False {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2316.967110] env[62277]: DEBUG nova.virt.hardware [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Flavor limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2316.967281] env[62277]: DEBUG nova.virt.hardware [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Image limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2316.967454] env[62277]: DEBUG nova.virt.hardware [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Flavor pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2316.967598] env[62277]: DEBUG nova.virt.hardware [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Image pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2316.967740] env[62277]: DEBUG nova.virt.hardware [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2316.967942] env[62277]: DEBUG nova.virt.hardware [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2316.968108] env[62277]: DEBUG nova.virt.hardware [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62277) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 2316.968270] env[62277]: DEBUG nova.virt.hardware [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Got 1 possible topologies {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2316.968425] env[62277]: DEBUG nova.virt.hardware [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2316.968629] env[62277]: DEBUG nova.virt.hardware [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2316.969760] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-205aae14-2ef1-4073-8ae3-55669e7fe744 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2316.981585] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b362f3f4-e8e0-4667-8f6f-d2939bd91519 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2317.011017] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4065b7ad-61ec-438d-bdd8-7528bd9140f9 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2317.016082] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e20e6e9-ebc5-49ed-9439-7fd2706da83e {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2317.045692] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-160a843a-0559-4a39-bc77-ec5815be4ae7 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2317.052830] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b102b611-6f03-443c-87e4-896982c904fa {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2317.065743] env[62277]: DEBUG nova.compute.provider_tree [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2317.074318] env[62277]: DEBUG nova.scheduler.client.report [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 
196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2317.087802] env[62277]: DEBUG oslo_concurrency.lockutils [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.272s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2317.088710] env[62277]: ERROR nova.compute.manager [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Failed to build and run instance: nova.exception.ImageNotAuthorized: Not authorized for image 6f125163-af69-40e9-92ae-3b8a01d74b60. [ 2317.088710] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Traceback (most recent call last): [ 2317.088710] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 2317.088710] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 2317.088710] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 2317.088710] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] result = getattr(controller, method)(*args, **kwargs) [ 2317.088710] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 2317.088710] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] return self._get(image_id) [ 2317.088710] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 2317.088710] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] return RequestIdProxy(wrapped(*args, **kwargs)) [ 2317.088710] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 2317.088710] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] resp, body = self.http_client.get(url, headers=header) [ 2317.088710] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 2317.088710] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] return self.request(url, 'GET', **kwargs) [ 2317.088710] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 2317.088710] 
env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] return self._handle_response(resp) [ 2317.088710] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 2317.088710] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] raise exc.from_response(resp, resp.content) [ 2317.088710] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 2317.088710] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] [ 2317.088710] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] During handling of the above exception, another exception occurred: [ 2317.088710] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] [ 2317.088710] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Traceback (most recent call last): [ 2317.088710] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2317.088710] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] self.driver.spawn(context, instance, image_meta, [ 2317.088710] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2317.088710] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2317.088710] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2317.088710] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] self._fetch_image_if_missing(context, vi) [ 2317.088710] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 2317.088710] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] image_fetch(context, vi, tmp_image_ds_loc) [ 2317.089482] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 2317.089482] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] images.fetch_image( [ 2317.089482] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 2317.089482] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] metadata = IMAGE_API.get(context, image_ref) [ 2317.089482] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 
2317.089482] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] return session.show(context, image_id, [ 2317.089482] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 2317.089482] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] _reraise_translated_image_exception(image_id) [ 2317.089482] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 2317.089482] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] raise new_exc.with_traceback(exc_trace) [ 2317.089482] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 2317.089482] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 2317.089482] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 2317.089482] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] result = getattr(controller, method)(*args, **kwargs) [ 2317.089482] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 2317.089482] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] return self._get(image_id) [ 2317.089482] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 2317.089482] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] return RequestIdProxy(wrapped(*args, **kwargs)) [ 2317.089482] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 2317.089482] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] resp, body = self.http_client.get(url, headers=header) [ 2317.089482] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 2317.089482] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] return self.request(url, 'GET', **kwargs) [ 2317.089482] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 2317.089482] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] return self._handle_response(resp) [ 2317.089482] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 2317.089482] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] raise 
exc.from_response(resp, resp.content) [ 2317.089482] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] nova.exception.ImageNotAuthorized: Not authorized for image 6f125163-af69-40e9-92ae-3b8a01d74b60. [ 2317.089482] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] [ 2317.090042] env[62277]: DEBUG nova.compute.utils [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Not authorized for image 6f125163-af69-40e9-92ae-3b8a01d74b60. {{(pid=62277) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2317.091708] env[62277]: DEBUG nova.compute.manager [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Build of instance 9f7d4431-d5ea-4f9b-888b-77a6a7772047 was re-scheduled: Not authorized for image 6f125163-af69-40e9-92ae-3b8a01d74b60. {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2317.092207] env[62277]: DEBUG nova.compute.manager [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Unplugging VIFs for instance {{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2317.092439] env[62277]: DEBUG oslo_concurrency.lockutils [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Acquiring lock "refresh_cache-9f7d4431-d5ea-4f9b-888b-77a6a7772047" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2317.092596] env[62277]: DEBUG oslo_concurrency.lockutils [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Acquired lock "refresh_cache-9f7d4431-d5ea-4f9b-888b-77a6a7772047" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2317.092755] env[62277]: DEBUG nova.network.neutron [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Building network info cache for instance {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2317.117022] env[62277]: DEBUG nova.network.neutron [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Instance cache missing network info. 
{{(pid=62277) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2317.168710] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager.update_available_resource {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2317.177187] env[62277]: DEBUG nova.network.neutron [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2317.180184] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2317.180382] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2317.180573] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2317.180741] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62277) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2317.181835] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e56e9861-3825-4e4b-8837-03f22755e342 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2317.185511] env[62277]: DEBUG oslo_concurrency.lockutils [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Releasing lock "refresh_cache-9f7d4431-d5ea-4f9b-888b-77a6a7772047" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2317.185708] env[62277]: DEBUG nova.compute.manager [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2317.185884] env[62277]: DEBUG nova.compute.manager [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Skipping network deallocation for instance since networking was not requested. 
{{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 2317.193494] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-84602355-b559-49f5-8fe7-f40d6c203e3e {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2317.209049] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d4356e0b-eb7e-4c84-bd07-fbaf16a4aa17 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2317.215399] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c57b9040-e4e8-48a9-af96-d455bf053aba {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2317.245116] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181387MB free_disk=184GB free_vcpus=48 pci_devices=None {{(pid=62277) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2317.246085] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2317.246085] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2317.256553] env[62277]: DEBUG nova.network.neutron [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Successfully created port: 90470732-b8ad-48c6-8132-acf6f0dded11 {{(pid=62277) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2317.321462] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 8b9ef530-e79f-4cd4-8a88-83871ed65f90 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2317.321713] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance c4c22c8a-4300-45ce-8484-77c638f7bbc5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2317.321881] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance ad83bf06-d712-4bd4-8086-9c3b615adaf5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2317.322048] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance de543a46-26c3-40b3-9ccd-80bb1f9845d7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2317.322205] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 297d53df-7918-4389-9c63-a600755da969 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2317.322392] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 940561d5-723b-4e43-8fab-35e8af95ce09 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2317.322570] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 6737c3b9-d9e6-4879-a6df-46d3c7dee40e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2317.322725] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 4f26ed27-558d-489a-9141-ec63b6164cc8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2317.322874] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 26a7549d-94b4-4113-ab8b-10886eafcd49 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2317.325032] env[62277]: INFO nova.scheduler.client.report [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Deleted allocations for instance 9f7d4431-d5ea-4f9b-888b-77a6a7772047 [ 2317.344495] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 295cb2fd-b409-4d5c-8fef-12b7acd9fec0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 2317.344495] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Total usable vcpus: 48, total allocated vcpus: 9 {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2317.344495] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1664MB phys_disk=200GB used_disk=9GB total_vcpus=48 used_vcpus=9 pci_stats=[] {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2317.347228] env[62277]: DEBUG oslo_concurrency.lockutils [None req-bf0b4063-661f-40ec-9fb2-575b4072aba5 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Lock "9f7d4431-d5ea-4f9b-888b-77a6a7772047" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 633.508s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2317.348881] env[62277]: DEBUG oslo_concurrency.lockutils [None req-1c837b54-288e-4369-b0ed-3ab98b6f7f71 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Lock "9f7d4431-d5ea-4f9b-888b-77a6a7772047" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 438.109s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2317.348881] env[62277]: DEBUG oslo_concurrency.lockutils [None req-1c837b54-288e-4369-b0ed-3ab98b6f7f71 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Acquiring lock "9f7d4431-d5ea-4f9b-888b-77a6a7772047-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2317.348881] env[62277]: DEBUG oslo_concurrency.lockutils [None req-1c837b54-288e-4369-b0ed-3ab98b6f7f71 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Lock "9f7d4431-d5ea-4f9b-888b-77a6a7772047-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2317.349071] env[62277]: DEBUG oslo_concurrency.lockutils [None req-1c837b54-288e-4369-b0ed-3ab98b6f7f71 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Lock "9f7d4431-d5ea-4f9b-888b-77a6a7772047-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2317.350856] env[62277]: INFO nova.compute.manager [None req-1c837b54-288e-4369-b0ed-3ab98b6f7f71 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Terminating instance [ 2317.355022] env[62277]: DEBUG oslo_concurrency.lockutils [None req-1c837b54-288e-4369-b0ed-3ab98b6f7f71 tempest-ServersAaction247Test-2025697911 
tempest-ServersAaction247Test-2025697911-project-member] Acquiring lock "refresh_cache-9f7d4431-d5ea-4f9b-888b-77a6a7772047" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2317.355022] env[62277]: DEBUG oslo_concurrency.lockutils [None req-1c837b54-288e-4369-b0ed-3ab98b6f7f71 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Acquired lock "refresh_cache-9f7d4431-d5ea-4f9b-888b-77a6a7772047" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2317.355022] env[62277]: DEBUG nova.network.neutron [None req-1c837b54-288e-4369-b0ed-3ab98b6f7f71 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Building network info cache for instance {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2317.366647] env[62277]: DEBUG nova.compute.manager [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 2317.388685] env[62277]: DEBUG nova.network.neutron [None req-1c837b54-288e-4369-b0ed-3ab98b6f7f71 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Instance cache missing network info. {{(pid=62277) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2317.432830] env[62277]: DEBUG oslo_concurrency.lockutils [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2317.462931] env[62277]: DEBUG nova.network.neutron [None req-1c837b54-288e-4369-b0ed-3ab98b6f7f71 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2317.476354] env[62277]: DEBUG oslo_concurrency.lockutils [None req-1c837b54-288e-4369-b0ed-3ab98b6f7f71 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Releasing lock "refresh_cache-9f7d4431-d5ea-4f9b-888b-77a6a7772047" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2317.476805] env[62277]: DEBUG nova.compute.manager [None req-1c837b54-288e-4369-b0ed-3ab98b6f7f71 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Start destroying the instance on the hypervisor. 
{{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2317.476961] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-1c837b54-288e-4369-b0ed-3ab98b6f7f71 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2317.479748] env[62277]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-646acf5c-bcc8-476b-97d8-16cf3d980a30 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2317.489048] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-465b92ec-95b6-4e0d-b187-92398b0ec2bc {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2317.520049] env[62277]: WARNING nova.virt.vmwareapi.vmops [None req-1c837b54-288e-4369-b0ed-3ab98b6f7f71 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 9f7d4431-d5ea-4f9b-888b-77a6a7772047 could not be found. [ 2317.520269] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-1c837b54-288e-4369-b0ed-3ab98b6f7f71 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2317.520444] env[62277]: INFO nova.compute.manager [None req-1c837b54-288e-4369-b0ed-3ab98b6f7f71 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2317.520688] env[62277]: DEBUG oslo.service.loopingcall [None req-1c837b54-288e-4369-b0ed-3ab98b6f7f71 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2317.523233] env[62277]: DEBUG nova.compute.manager [-] [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2317.523349] env[62277]: DEBUG nova.network.neutron [-] [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2317.539760] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9bd951d-1c57-4579-8c76-22fca87c864c {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2317.547356] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78aad51b-cf59-4536-a829-10924219428e {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2317.578796] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09fabe70-e693-4692-80f8-eb7e56ed012d {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2317.587064] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2a1d844c-ce3d-4d75-8967-4f2b98d2f425 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2317.600970] env[62277]: DEBUG nova.compute.provider_tree [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2317.611323] env[62277]: DEBUG nova.scheduler.client.report [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2317.629891] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62277) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2317.630163] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.385s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2317.630465] env[62277]: DEBUG oslo_concurrency.lockutils [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Lock "compute_resources" acquired by 
"nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.198s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2317.632311] env[62277]: INFO nova.compute.claims [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2317.685920] env[62277]: DEBUG neutronclient.v2_0.client [-] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=62277) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 2317.686188] env[62277]: ERROR nova.network.neutron [-] Neutron client was not able to generate a valid admin token, please verify Neutron admin credential located in nova.conf: neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 2317.686713] env[62277]: ERROR oslo.service.loopingcall [-] Dynamic interval looping call 'oslo_service.loopingcall.RetryDecorator.__call__.._func' failed: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 2317.686713] env[62277]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 2317.686713] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2317.686713] env[62277]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 2317.686713] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 2317.686713] env[62277]: ERROR oslo.service.loopingcall exception_handler_v20(status_code, error_body) [ 2317.686713] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 2317.686713] env[62277]: ERROR oslo.service.loopingcall raise client_exc(message=error_message, [ 2317.686713] env[62277]: ERROR oslo.service.loopingcall neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 2317.686713] env[62277]: ERROR oslo.service.loopingcall Neutron server returns request_ids: ['req-13b8f4e9-7cfd-448a-b676-6db01de7f322'] [ 2317.686713] env[62277]: ERROR oslo.service.loopingcall [ 2317.686713] env[62277]: ERROR oslo.service.loopingcall During handling of the above exception, another exception occurred: [ 2317.686713] env[62277]: ERROR oslo.service.loopingcall [ 2317.686713] env[62277]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 2317.686713] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 2317.686713] env[62277]: ERROR oslo.service.loopingcall result = func(*self.args, **self.kw) [ 2317.686713] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 2317.686713] env[62277]: ERROR oslo.service.loopingcall result = f(*args, **kwargs) [ 2317.686713] env[62277]: ERROR 
oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 2317.686713] env[62277]: ERROR oslo.service.loopingcall self._deallocate_network( [ 2317.686713] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 2317.686713] env[62277]: ERROR oslo.service.loopingcall self.network_api.deallocate_for_instance( [ 2317.686713] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 2317.686713] env[62277]: ERROR oslo.service.loopingcall data = neutron.list_ports(**search_opts) [ 2317.686713] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2317.686713] env[62277]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 2317.686713] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 2317.686713] env[62277]: ERROR oslo.service.loopingcall return self.list('ports', self.ports_path, retrieve_all, [ 2317.686713] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2317.686713] env[62277]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 2317.686713] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 2317.686713] env[62277]: ERROR oslo.service.loopingcall for r in self._pagination(collection, path, **params): [ 2317.686713] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 2317.686713] env[62277]: ERROR oslo.service.loopingcall res = self.get(path, params=params) [ 2317.686713] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2317.686713] env[62277]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 2317.686713] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 2317.686713] env[62277]: ERROR oslo.service.loopingcall return self.retry_request("GET", action, body=body, [ 2317.686713] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2317.686713] env[62277]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 2317.686713] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 2317.686713] env[62277]: ERROR oslo.service.loopingcall return self.do_request(method, action, body=body, [ 2317.686713] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2317.686713] env[62277]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 2317.686713] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 2317.686713] env[62277]: ERROR oslo.service.loopingcall self._handle_fault_response(status_code, replybody, resp) [ 2317.688182] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 2317.688182] 
env[62277]: ERROR oslo.service.loopingcall raise exception.NeutronAdminCredentialConfigurationInvalid() [ 2317.688182] env[62277]: ERROR oslo.service.loopingcall nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 2317.688182] env[62277]: ERROR oslo.service.loopingcall [ 2317.688182] env[62277]: ERROR nova.compute.manager [None req-1c837b54-288e-4369-b0ed-3ab98b6f7f71 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Failed to deallocate network for instance. Error: Networking client is experiencing an unauthorized exception.: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 2317.720094] env[62277]: ERROR nova.compute.manager [None req-1c837b54-288e-4369-b0ed-3ab98b6f7f71 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Setting instance vm_state to ERROR: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 2317.720094] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Traceback (most recent call last): [ 2317.720094] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2317.720094] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] ret = obj(*args, **kwargs) [ 2317.720094] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 2317.720094] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] exception_handler_v20(status_code, error_body) [ 2317.720094] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 2317.720094] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] raise client_exc(message=error_message, [ 2317.720094] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 2317.720094] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Neutron server returns request_ids: ['req-13b8f4e9-7cfd-448a-b676-6db01de7f322'] [ 2317.720094] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] [ 2317.720094] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] During handling of the above exception, another exception occurred: [ 2317.720094] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] [ 2317.720094] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Traceback (most recent call last): [ 2317.720094] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/nova/nova/compute/manager.py", line 3315, in 
do_terminate_instance [ 2317.720094] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] self._delete_instance(context, instance, bdms) [ 2317.720094] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance [ 2317.720094] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] self._shutdown_instance(context, instance, bdms) [ 2317.720094] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance [ 2317.720094] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] self._try_deallocate_network(context, instance, requested_networks) [ 2317.720094] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network [ 2317.720094] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] with excutils.save_and_reraise_exception(): [ 2317.720094] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 2317.720094] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] self.force_reraise() [ 2317.720094] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 2317.720094] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] raise self.value [ 2317.720094] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network [ 2317.720094] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] _deallocate_network_with_retries() [ 2317.720094] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 2317.720094] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] return evt.wait() [ 2317.720094] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2317.720094] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] result = hub.switch() [ 2317.720821] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2317.720821] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] return self.greenlet.switch() [ 2317.720821] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 2317.720821] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] result = func(*self.args, **self.kw) [ 2317.720821] env[62277]: ERROR 
nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 2317.720821] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] result = f(*args, **kwargs) [ 2317.720821] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 2317.720821] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] self._deallocate_network( [ 2317.720821] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 2317.720821] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] self.network_api.deallocate_for_instance( [ 2317.720821] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 2317.720821] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] data = neutron.list_ports(**search_opts) [ 2317.720821] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2317.720821] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] ret = obj(*args, **kwargs) [ 2317.720821] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 2317.720821] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] return self.list('ports', self.ports_path, retrieve_all, [ 2317.720821] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2317.720821] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] ret = obj(*args, **kwargs) [ 2317.720821] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 2317.720821] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] for r in self._pagination(collection, path, **params): [ 2317.720821] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 2317.720821] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] res = self.get(path, params=params) [ 2317.720821] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2317.720821] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] ret = obj(*args, **kwargs) [ 2317.720821] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 2317.720821] env[62277]: ERROR 
nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] return self.retry_request("GET", action, body=body, [ 2317.720821] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2317.720821] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] ret = obj(*args, **kwargs) [ 2317.720821] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 2317.720821] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] return self.do_request(method, action, body=body, [ 2317.720821] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2317.720821] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] ret = obj(*args, **kwargs) [ 2317.720821] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 2317.721713] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] self._handle_fault_response(status_code, replybody, resp) [ 2317.721713] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 2317.721713] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] raise exception.NeutronAdminCredentialConfigurationInvalid() [ 2317.721713] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 2317.721713] env[62277]: ERROR nova.compute.manager [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] [ 2317.748912] env[62277]: DEBUG oslo_concurrency.lockutils [None req-1c837b54-288e-4369-b0ed-3ab98b6f7f71 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Lock "9f7d4431-d5ea-4f9b-888b-77a6a7772047" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.400s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2317.806416] env[62277]: INFO nova.compute.manager [None req-1c837b54-288e-4369-b0ed-3ab98b6f7f71 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] [instance: 9f7d4431-d5ea-4f9b-888b-77a6a7772047] Successfully reverted task state from None on failure for instance. [ 2317.810474] env[62277]: ERROR oslo_messaging.rpc.server [None req-1c837b54-288e-4369-b0ed-3ab98b6f7f71 tempest-ServersAaction247Test-2025697911 tempest-ServersAaction247Test-2025697911-project-member] Exception during message handling: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 2317.810474] env[62277]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 2317.810474] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2317.810474] env[62277]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 2317.810474] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 2317.810474] env[62277]: ERROR oslo_messaging.rpc.server exception_handler_v20(status_code, error_body) [ 2317.810474] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 2317.810474] env[62277]: ERROR oslo_messaging.rpc.server raise client_exc(message=error_message, [ 2317.810474] env[62277]: ERROR oslo_messaging.rpc.server neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 2317.810474] env[62277]: ERROR oslo_messaging.rpc.server Neutron server returns request_ids: ['req-13b8f4e9-7cfd-448a-b676-6db01de7f322'] [ 2317.810474] env[62277]: ERROR oslo_messaging.rpc.server [ 2317.810474] env[62277]: ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred: [ 2317.810474] env[62277]: ERROR oslo_messaging.rpc.server [ 2317.810474] env[62277]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 2317.810474] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming [ 2317.810474] env[62277]: ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message) [ 2317.810474] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch [ 2317.810474] env[62277]: ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) [ 2317.810474] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch [ 2317.810474] env[62277]: ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) [ 2317.810474] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 65, in wrapped [ 2317.810474] env[62277]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 2317.810474] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 2317.810474] env[62277]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 2317.810474] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 2317.810474] env[62277]: ERROR oslo_messaging.rpc.server raise self.value [ 2317.810474] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 63, in wrapped [ 2317.810474] env[62277]: ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw) [ 2317.810474] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 166, in decorated_function [ 2317.810474] env[62277]: ERROR oslo_messaging.rpc.server with 
excutils.save_and_reraise_exception(): [ 2317.810474] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 2317.810474] env[62277]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 2317.810474] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 2317.810474] env[62277]: ERROR oslo_messaging.rpc.server raise self.value [ 2317.810474] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 157, in decorated_function [ 2317.810474] env[62277]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 2317.810474] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/utils.py", line 1439, in decorated_function [ 2317.810474] env[62277]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 2317.810474] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 213, in decorated_function [ 2317.810474] env[62277]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 2317.810474] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 2317.810474] env[62277]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 2317.810474] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 2317.810474] env[62277]: ERROR oslo_messaging.rpc.server raise self.value [ 2317.810474] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 203, in decorated_function [ 2317.811537] env[62277]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 2317.811537] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3327, in terminate_instance [ 2317.811537] env[62277]: ERROR oslo_messaging.rpc.server do_terminate_instance(instance, bdms) [ 2317.811537] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 2317.811537] env[62277]: ERROR oslo_messaging.rpc.server return f(*args, **kwargs) [ 2317.811537] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3322, in do_terminate_instance [ 2317.811537] env[62277]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 2317.811537] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 2317.811537] env[62277]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 2317.811537] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 2317.811537] env[62277]: ERROR oslo_messaging.rpc.server raise self.value [ 2317.811537] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance [ 2317.811537] env[62277]: ERROR oslo_messaging.rpc.server self._delete_instance(context, instance, bdms) [ 2317.811537] env[62277]: ERROR oslo_messaging.rpc.server File 
"/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance [ 2317.811537] env[62277]: ERROR oslo_messaging.rpc.server self._shutdown_instance(context, instance, bdms) [ 2317.811537] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance [ 2317.811537] env[62277]: ERROR oslo_messaging.rpc.server self._try_deallocate_network(context, instance, requested_networks) [ 2317.811537] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network [ 2317.811537] env[62277]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 2317.811537] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 2317.811537] env[62277]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 2317.811537] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 2317.811537] env[62277]: ERROR oslo_messaging.rpc.server raise self.value [ 2317.811537] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network [ 2317.811537] env[62277]: ERROR oslo_messaging.rpc.server _deallocate_network_with_retries() [ 2317.811537] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 2317.811537] env[62277]: ERROR oslo_messaging.rpc.server return evt.wait() [ 2317.811537] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2317.811537] env[62277]: ERROR oslo_messaging.rpc.server result = hub.switch() [ 2317.811537] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2317.811537] env[62277]: ERROR oslo_messaging.rpc.server return self.greenlet.switch() [ 2317.811537] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 2317.811537] env[62277]: ERROR oslo_messaging.rpc.server result = func(*self.args, **self.kw) [ 2317.811537] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 2317.811537] env[62277]: ERROR oslo_messaging.rpc.server result = f(*args, **kwargs) [ 2317.811537] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 2317.811537] env[62277]: ERROR oslo_messaging.rpc.server self._deallocate_network( [ 2317.811537] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 2317.811537] env[62277]: ERROR oslo_messaging.rpc.server self.network_api.deallocate_for_instance( [ 2317.811537] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 2317.811537] env[62277]: ERROR oslo_messaging.rpc.server data = neutron.list_ports(**search_opts) [ 2317.811537] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2317.811537] env[62277]: ERROR 
oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 2317.811537] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 2317.811537] env[62277]: ERROR oslo_messaging.rpc.server return self.list('ports', self.ports_path, retrieve_all, [ 2317.811537] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2317.811537] env[62277]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 2317.812552] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 2317.812552] env[62277]: ERROR oslo_messaging.rpc.server for r in self._pagination(collection, path, **params): [ 2317.812552] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 2317.812552] env[62277]: ERROR oslo_messaging.rpc.server res = self.get(path, params=params) [ 2317.812552] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2317.812552] env[62277]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 2317.812552] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 2317.812552] env[62277]: ERROR oslo_messaging.rpc.server return self.retry_request("GET", action, body=body, [ 2317.812552] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2317.812552] env[62277]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 2317.812552] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 2317.812552] env[62277]: ERROR oslo_messaging.rpc.server return self.do_request(method, action, body=body, [ 2317.812552] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2317.812552] env[62277]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 2317.812552] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 2317.812552] env[62277]: ERROR oslo_messaging.rpc.server self._handle_fault_response(status_code, replybody, resp) [ 2317.812552] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 2317.812552] env[62277]: ERROR oslo_messaging.rpc.server raise exception.NeutronAdminCredentialConfigurationInvalid() [ 2317.812552] env[62277]: ERROR oslo_messaging.rpc.server nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 2317.812552] env[62277]: ERROR oslo_messaging.rpc.server [ 2317.823956] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78a3108e-0b68-4ec8-831d-4c6429e5bffe {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2317.831540] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-00f6d735-6d76-4562-a439-73da37b245e5 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2317.861257] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f991684f-eb6b-4997-87b5-b5d4bdead8a7 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2317.868740] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6c235157-c7b5-4cab-9007-9c20347b26eb {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2317.883345] env[62277]: DEBUG nova.compute.provider_tree [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2317.891456] env[62277]: DEBUG nova.scheduler.client.report [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2317.906980] env[62277]: DEBUG oslo_concurrency.lockutils [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.276s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2317.907494] env[62277]: DEBUG nova.compute.manager [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Start building networks asynchronously for instance. 
{{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 2317.943159] env[62277]: DEBUG nova.compute.utils [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Using /dev/sd instead of None {{(pid=62277) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2317.944592] env[62277]: DEBUG nova.compute.manager [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Allocating IP information in the background. {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 2317.944592] env[62277]: DEBUG nova.network.neutron [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] allocate_for_instance() {{(pid=62277) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2317.953042] env[62277]: DEBUG nova.compute.manager [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Start building block device mappings for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 2317.978918] env[62277]: DEBUG nova.network.neutron [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Successfully updated port: 90470732-b8ad-48c6-8132-acf6f0dded11 {{(pid=62277) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2317.991210] env[62277]: DEBUG oslo_concurrency.lockutils [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Acquiring lock "refresh_cache-26a7549d-94b4-4113-ab8b-10886eafcd49" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2317.991294] env[62277]: DEBUG oslo_concurrency.lockutils [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Acquired lock "refresh_cache-26a7549d-94b4-4113-ab8b-10886eafcd49" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2317.991416] env[62277]: DEBUG nova.network.neutron [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Building network info cache for instance {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2318.026042] env[62277]: DEBUG nova.compute.manager [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Start spawning the instance on the hypervisor. 
{{(pid=62277) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 2318.033926] env[62277]: DEBUG nova.network.neutron [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Instance cache missing network info. {{(pid=62277) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2318.037042] env[62277]: DEBUG nova.policy [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '013359a6ab0644799bb338125a970c37', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '47f21dc2b2ad4fe692324779a4a84760', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62277) authorize /opt/stack/nova/nova/policy.py:203}} [ 2318.052353] env[62277]: DEBUG nova.virt.hardware [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-11-06T22:11:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-11-06T22:11:15Z,direct_url=,disk_format='vmdk',id=6f125163-af69-40e9-92ae-3b8a01d74b60,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='63ed6630e9c140baa826f53d7a0564d1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-11-06T22:11:16Z,virtual_size=,visibility=), allow threads: False {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2318.052589] env[62277]: DEBUG nova.virt.hardware [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Flavor limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2318.052740] env[62277]: DEBUG nova.virt.hardware [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Image limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2318.052914] env[62277]: DEBUG nova.virt.hardware [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Flavor pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2318.053068] env[62277]: DEBUG nova.virt.hardware [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Image pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2318.053216] env[62277]: 
DEBUG nova.virt.hardware [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2318.053416] env[62277]: DEBUG nova.virt.hardware [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2318.053600] env[62277]: DEBUG nova.virt.hardware [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2318.053765] env[62277]: DEBUG nova.virt.hardware [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Got 1 possible topologies {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2318.053922] env[62277]: DEBUG nova.virt.hardware [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2318.054101] env[62277]: DEBUG nova.virt.hardware [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2318.055020] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db468f51-880d-4c8a-9937-c46e1541ed7b {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2318.065158] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d74dd433-b696-4750-8265-2cc746e586d7 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2318.198330] env[62277]: DEBUG nova.network.neutron [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Updating instance_info_cache with network_info: [{"id": "90470732-b8ad-48c6-8132-acf6f0dded11", "address": "fa:16:3e:47:cd:8e", "network": {"id": "f0d2c639-a921-4764-8b8a-3590b3b7d7f3", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1936428655-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": 
{"injected": false, "tenant_id": "bd7e0cacdaeb4e6e80d603d41978a23f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3b7bf7d4-8e0c-4cee-84ba-244e73ef6379", "external-id": "nsx-vlan-transportzone-423", "segmentation_id": 423, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap90470732-b8", "ovs_interfaceid": "90470732-b8ad-48c6-8132-acf6f0dded11", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2318.210896] env[62277]: DEBUG oslo_concurrency.lockutils [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Releasing lock "refresh_cache-26a7549d-94b4-4113-ab8b-10886eafcd49" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2318.211197] env[62277]: DEBUG nova.compute.manager [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Instance network_info: |[{"id": "90470732-b8ad-48c6-8132-acf6f0dded11", "address": "fa:16:3e:47:cd:8e", "network": {"id": "f0d2c639-a921-4764-8b8a-3590b3b7d7f3", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1936428655-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "bd7e0cacdaeb4e6e80d603d41978a23f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3b7bf7d4-8e0c-4cee-84ba-244e73ef6379", "external-id": "nsx-vlan-transportzone-423", "segmentation_id": 423, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap90470732-b8", "ovs_interfaceid": "90470732-b8ad-48c6-8132-acf6f0dded11", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 2318.211616] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:47:cd:8e', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '3b7bf7d4-8e0c-4cee-84ba-244e73ef6379', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '90470732-b8ad-48c6-8132-acf6f0dded11', 'vif_model': 'vmxnet3'}] {{(pid=62277) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2318.219506] env[62277]: DEBUG oslo.service.loopingcall [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Waiting for 
function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2318.219968] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Creating VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2318.220204] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-c2f31bb3-e520-41af-afa9-97ebef62d06b {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2318.240866] env[62277]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2318.240866] env[62277]: value = "task-1405510" [ 2318.240866] env[62277]: _type = "Task" [ 2318.240866] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2318.249024] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405510, 'name': CreateVM_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2318.327908] env[62277]: DEBUG nova.network.neutron [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Successfully created port: 0e37cb63-3d96-442e-ad18-48d104e91e08 {{(pid=62277) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2318.338696] env[62277]: DEBUG nova.compute.manager [req-2ad1e0ac-3965-4034-b48f-2c3bba56f575 req-8a5ee678-9a30-4068-84c5-7276d137d5a3 service nova] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Received event network-vif-plugged-90470732-b8ad-48c6-8132-acf6f0dded11 {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 2318.338922] env[62277]: DEBUG oslo_concurrency.lockutils [req-2ad1e0ac-3965-4034-b48f-2c3bba56f575 req-8a5ee678-9a30-4068-84c5-7276d137d5a3 service nova] Acquiring lock "26a7549d-94b4-4113-ab8b-10886eafcd49-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2318.339145] env[62277]: DEBUG oslo_concurrency.lockutils [req-2ad1e0ac-3965-4034-b48f-2c3bba56f575 req-8a5ee678-9a30-4068-84c5-7276d137d5a3 service nova] Lock "26a7549d-94b4-4113-ab8b-10886eafcd49-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2318.339316] env[62277]: DEBUG oslo_concurrency.lockutils [req-2ad1e0ac-3965-4034-b48f-2c3bba56f575 req-8a5ee678-9a30-4068-84c5-7276d137d5a3 service nova] Lock "26a7549d-94b4-4113-ab8b-10886eafcd49-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2318.340040] env[62277]: DEBUG nova.compute.manager [req-2ad1e0ac-3965-4034-b48f-2c3bba56f575 req-8a5ee678-9a30-4068-84c5-7276d137d5a3 service nova] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] No waiting events found dispatching network-vif-plugged-90470732-b8ad-48c6-8132-acf6f0dded11 {{(pid=62277) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2318.340040] env[62277]: 
WARNING nova.compute.manager [req-2ad1e0ac-3965-4034-b48f-2c3bba56f575 req-8a5ee678-9a30-4068-84c5-7276d137d5a3 service nova] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Received unexpected event network-vif-plugged-90470732-b8ad-48c6-8132-acf6f0dded11 for instance with vm_state building and task_state spawning. [ 2318.340040] env[62277]: DEBUG nova.compute.manager [req-2ad1e0ac-3965-4034-b48f-2c3bba56f575 req-8a5ee678-9a30-4068-84c5-7276d137d5a3 service nova] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Received event network-changed-90470732-b8ad-48c6-8132-acf6f0dded11 {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 2318.340040] env[62277]: DEBUG nova.compute.manager [req-2ad1e0ac-3965-4034-b48f-2c3bba56f575 req-8a5ee678-9a30-4068-84c5-7276d137d5a3 service nova] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Refreshing instance network info cache due to event network-changed-90470732-b8ad-48c6-8132-acf6f0dded11. {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11104}} [ 2318.340512] env[62277]: DEBUG oslo_concurrency.lockutils [req-2ad1e0ac-3965-4034-b48f-2c3bba56f575 req-8a5ee678-9a30-4068-84c5-7276d137d5a3 service nova] Acquiring lock "refresh_cache-26a7549d-94b4-4113-ab8b-10886eafcd49" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2318.340512] env[62277]: DEBUG oslo_concurrency.lockutils [req-2ad1e0ac-3965-4034-b48f-2c3bba56f575 req-8a5ee678-9a30-4068-84c5-7276d137d5a3 service nova] Acquired lock "refresh_cache-26a7549d-94b4-4113-ab8b-10886eafcd49" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2318.340825] env[62277]: DEBUG nova.network.neutron [req-2ad1e0ac-3965-4034-b48f-2c3bba56f575 req-8a5ee678-9a30-4068-84c5-7276d137d5a3 service nova] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Refreshing network info cache for port 90470732-b8ad-48c6-8132-acf6f0dded11 {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 2318.601237] env[62277]: DEBUG nova.network.neutron [req-2ad1e0ac-3965-4034-b48f-2c3bba56f575 req-8a5ee678-9a30-4068-84c5-7276d137d5a3 service nova] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Updated VIF entry in instance network info cache for port 90470732-b8ad-48c6-8132-acf6f0dded11. 
{{(pid=62277) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 2318.601623] env[62277]: DEBUG nova.network.neutron [req-2ad1e0ac-3965-4034-b48f-2c3bba56f575 req-8a5ee678-9a30-4068-84c5-7276d137d5a3 service nova] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Updating instance_info_cache with network_info: [{"id": "90470732-b8ad-48c6-8132-acf6f0dded11", "address": "fa:16:3e:47:cd:8e", "network": {"id": "f0d2c639-a921-4764-8b8a-3590b3b7d7f3", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1936428655-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "bd7e0cacdaeb4e6e80d603d41978a23f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3b7bf7d4-8e0c-4cee-84ba-244e73ef6379", "external-id": "nsx-vlan-transportzone-423", "segmentation_id": 423, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap90470732-b8", "ovs_interfaceid": "90470732-b8ad-48c6-8132-acf6f0dded11", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2318.611777] env[62277]: DEBUG oslo_concurrency.lockutils [req-2ad1e0ac-3965-4034-b48f-2c3bba56f575 req-8a5ee678-9a30-4068-84c5-7276d137d5a3 service nova] Releasing lock "refresh_cache-26a7549d-94b4-4113-ab8b-10886eafcd49" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2318.634954] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2318.750838] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405510, 'name': CreateVM_Task, 'duration_secs': 0.314645} completed successfully. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2318.751061] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Created VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2318.751785] env[62277]: DEBUG oslo_concurrency.lockutils [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2318.751946] env[62277]: DEBUG oslo_concurrency.lockutils [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2318.752303] env[62277]: DEBUG oslo_concurrency.lockutils [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2318.752579] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-befc0a49-0c39-4a10-8d85-b916f0752a1c {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2318.757377] env[62277]: DEBUG oslo_vmware.api [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Waiting for the task: (returnval){ [ 2318.757377] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52dc5aba-02ca-ff20-7447-c48304696a6a" [ 2318.757377] env[62277]: _type = "Task" [ 2318.757377] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2318.764697] env[62277]: DEBUG oslo_vmware.api [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52dc5aba-02ca-ff20-7447-c48304696a6a, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2318.923057] env[62277]: DEBUG nova.compute.manager [req-0ece4667-c896-46a9-9017-80594bfb300a req-2e80e0f8-71b1-43de-88b1-266d73cb6a66 service nova] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Received event network-vif-plugged-0e37cb63-3d96-442e-ad18-48d104e91e08 {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 2318.923251] env[62277]: DEBUG oslo_concurrency.lockutils [req-0ece4667-c896-46a9-9017-80594bfb300a req-2e80e0f8-71b1-43de-88b1-266d73cb6a66 service nova] Acquiring lock "295cb2fd-b409-4d5c-8fef-12b7acd9fec0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2318.923506] env[62277]: DEBUG oslo_concurrency.lockutils [req-0ece4667-c896-46a9-9017-80594bfb300a req-2e80e0f8-71b1-43de-88b1-266d73cb6a66 service nova] Lock "295cb2fd-b409-4d5c-8fef-12b7acd9fec0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2318.923719] env[62277]: DEBUG oslo_concurrency.lockutils [req-0ece4667-c896-46a9-9017-80594bfb300a req-2e80e0f8-71b1-43de-88b1-266d73cb6a66 service nova] Lock "295cb2fd-b409-4d5c-8fef-12b7acd9fec0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2318.923832] env[62277]: DEBUG nova.compute.manager [req-0ece4667-c896-46a9-9017-80594bfb300a req-2e80e0f8-71b1-43de-88b1-266d73cb6a66 service nova] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] No waiting events found dispatching network-vif-plugged-0e37cb63-3d96-442e-ad18-48d104e91e08 {{(pid=62277) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2318.923992] env[62277]: WARNING nova.compute.manager [req-0ece4667-c896-46a9-9017-80594bfb300a req-2e80e0f8-71b1-43de-88b1-266d73cb6a66 service nova] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Received unexpected event network-vif-plugged-0e37cb63-3d96-442e-ad18-48d104e91e08 for instance with vm_state building and task_state spawning. 
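[editor's note] The CreateVM_Task entries above show the usual oslo.vmware pattern: the driver submits a vCenter task (here Folder.CreateVM_Task), and wait_for_task() then polls the task via a looping call until it reports success or a fault, logging "progress is 0%" on each poll. The following is a minimal, synchronous sketch of that polling loop for illustration only; get_task_info and the state names are hypothetical stand-ins, and the real oslo.vmware code runs under eventlet with a fixed-interval looping call rather than time.sleep.

    import time

    class TaskFailed(Exception):
        """Raised when the vCenter task finishes in an error state."""

    def poll_task(get_task_info, interval=0.5, timeout=300):
        # Simplified stand-in for the wait_for_task()/_poll_task() pattern
        # visible in the log above.  `get_task_info` is a hypothetical
        # callable returning an object with .state, .progress and .error;
        # it is not part of oslo.vmware.
        deadline = time.monotonic() + timeout
        while time.monotonic() < deadline:
            info = get_task_info()
            if info.state == 'success':
                return info                   # e.g. CreateVM_Task completed
            if info.state == 'error':
                raise TaskFailed(info.error)  # surfaces faults such as InvalidArgument
            # 'queued' / 'running': report progress and poll again, as the
            # "progress is 0%" lines above do on every iteration.
            time.sleep(interval)
        raise TimeoutError('task did not complete within %s seconds' % timeout)
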
[ 2318.973847] env[62277]: DEBUG nova.network.neutron [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Successfully updated port: 0e37cb63-3d96-442e-ad18-48d104e91e08 {{(pid=62277) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2318.985288] env[62277]: DEBUG oslo_concurrency.lockutils [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Acquiring lock "refresh_cache-295cb2fd-b409-4d5c-8fef-12b7acd9fec0" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2318.985422] env[62277]: DEBUG oslo_concurrency.lockutils [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Acquired lock "refresh_cache-295cb2fd-b409-4d5c-8fef-12b7acd9fec0" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2318.985574] env[62277]: DEBUG nova.network.neutron [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Building network info cache for instance {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2319.022470] env[62277]: DEBUG nova.network.neutron [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Instance cache missing network info. 
{{(pid=62277) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2319.216447] env[62277]: DEBUG nova.network.neutron [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Updating instance_info_cache with network_info: [{"id": "0e37cb63-3d96-442e-ad18-48d104e91e08", "address": "fa:16:3e:34:52:f1", "network": {"id": "0fdd7696-a8f7-4fbe-88a6-4c5254049911", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-706519722-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "47f21dc2b2ad4fe692324779a4a84760", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7150f662-0cf1-44f9-ae14-d70f479649b6", "external-id": "nsx-vlan-transportzone-712", "segmentation_id": 712, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0e37cb63-3d", "ovs_interfaceid": "0e37cb63-3d96-442e-ad18-48d104e91e08", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2319.226870] env[62277]: DEBUG oslo_concurrency.lockutils [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Releasing lock "refresh_cache-295cb2fd-b409-4d5c-8fef-12b7acd9fec0" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2319.227172] env[62277]: DEBUG nova.compute.manager [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Instance network_info: |[{"id": "0e37cb63-3d96-442e-ad18-48d104e91e08", "address": "fa:16:3e:34:52:f1", "network": {"id": "0fdd7696-a8f7-4fbe-88a6-4c5254049911", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-706519722-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "47f21dc2b2ad4fe692324779a4a84760", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7150f662-0cf1-44f9-ae14-d70f479649b6", "external-id": "nsx-vlan-transportzone-712", "segmentation_id": 712, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0e37cb63-3d", "ovs_interfaceid": "0e37cb63-3d96-442e-ad18-48d104e91e08", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62277) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 2319.227592] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:34:52:f1', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '7150f662-0cf1-44f9-ae14-d70f479649b6', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '0e37cb63-3d96-442e-ad18-48d104e91e08', 'vif_model': 'vmxnet3'}] {{(pid=62277) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2319.235419] env[62277]: DEBUG oslo.service.loopingcall [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2319.235852] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Creating VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2319.236125] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-97c8fb6a-7c31-47f4-af63-d750380acadb {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2319.258192] env[62277]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2319.258192] env[62277]: value = "task-1405511" [ 2319.258192] env[62277]: _type = "Task" [ 2319.258192] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2319.269598] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405511, 'name': CreateVM_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2319.272656] env[62277]: DEBUG oslo_concurrency.lockutils [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2319.272897] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Processing image 6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2319.273159] env[62277]: DEBUG oslo_concurrency.lockutils [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2319.770739] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405511, 'name': CreateVM_Task, 'duration_secs': 0.268254} completed successfully. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2319.771050] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Created VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2319.771654] env[62277]: DEBUG oslo_concurrency.lockutils [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2319.771818] env[62277]: DEBUG oslo_concurrency.lockutils [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2319.772146] env[62277]: DEBUG oslo_concurrency.lockutils [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2319.772401] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7a4c7771-5573-498a-878e-0121514d052a {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2319.776791] env[62277]: DEBUG oslo_vmware.api [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Waiting for the task: (returnval){ [ 2319.776791] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52dc486e-6600-dfba-d8ca-c06e041c9ae7" [ 2319.776791] env[62277]: _type = "Task" [ 2319.776791] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2319.784274] env[62277]: DEBUG oslo_vmware.api [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52dc486e-6600-dfba-d8ca-c06e041c9ae7, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2320.168581] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2320.168888] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2320.169087] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=62277) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10528}} [ 2320.287463] env[62277]: DEBUG oslo_concurrency.lockutils [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2320.287713] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Processing image 6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2320.287926] env[62277]: DEBUG oslo_concurrency.lockutils [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2320.972210] env[62277]: DEBUG nova.compute.manager [req-9ff30fbe-c3f3-4a45-8c68-b829884c4fcb req-c2dc9ee7-b6f5-4296-9de6-e8b5a116667f service nova] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Received event network-changed-0e37cb63-3d96-442e-ad18-48d104e91e08 {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 2320.972210] env[62277]: DEBUG nova.compute.manager [req-9ff30fbe-c3f3-4a45-8c68-b829884c4fcb req-c2dc9ee7-b6f5-4296-9de6-e8b5a116667f service nova] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Refreshing instance network info cache due to event network-changed-0e37cb63-3d96-442e-ad18-48d104e91e08. 
{{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11104}} [ 2320.972210] env[62277]: DEBUG oslo_concurrency.lockutils [req-9ff30fbe-c3f3-4a45-8c68-b829884c4fcb req-c2dc9ee7-b6f5-4296-9de6-e8b5a116667f service nova] Acquiring lock "refresh_cache-295cb2fd-b409-4d5c-8fef-12b7acd9fec0" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2320.972492] env[62277]: DEBUG oslo_concurrency.lockutils [req-9ff30fbe-c3f3-4a45-8c68-b829884c4fcb req-c2dc9ee7-b6f5-4296-9de6-e8b5a116667f service nova] Acquired lock "refresh_cache-295cb2fd-b409-4d5c-8fef-12b7acd9fec0" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2320.972492] env[62277]: DEBUG nova.network.neutron [req-9ff30fbe-c3f3-4a45-8c68-b829884c4fcb req-c2dc9ee7-b6f5-4296-9de6-e8b5a116667f service nova] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Refreshing network info cache for port 0e37cb63-3d96-442e-ad18-48d104e91e08 {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 2321.230213] env[62277]: DEBUG nova.network.neutron [req-9ff30fbe-c3f3-4a45-8c68-b829884c4fcb req-c2dc9ee7-b6f5-4296-9de6-e8b5a116667f service nova] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Updated VIF entry in instance network info cache for port 0e37cb63-3d96-442e-ad18-48d104e91e08. {{(pid=62277) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 2321.230570] env[62277]: DEBUG nova.network.neutron [req-9ff30fbe-c3f3-4a45-8c68-b829884c4fcb req-c2dc9ee7-b6f5-4296-9de6-e8b5a116667f service nova] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Updating instance_info_cache with network_info: [{"id": "0e37cb63-3d96-442e-ad18-48d104e91e08", "address": "fa:16:3e:34:52:f1", "network": {"id": "0fdd7696-a8f7-4fbe-88a6-4c5254049911", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-706519722-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "47f21dc2b2ad4fe692324779a4a84760", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7150f662-0cf1-44f9-ae14-d70f479649b6", "external-id": "nsx-vlan-transportzone-712", "segmentation_id": 712, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0e37cb63-3d", "ovs_interfaceid": "0e37cb63-3d96-442e-ad18-48d104e91e08", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2321.239875] env[62277]: DEBUG oslo_concurrency.lockutils [req-9ff30fbe-c3f3-4a45-8c68-b829884c4fcb req-c2dc9ee7-b6f5-4296-9de6-e8b5a116667f service nova] Releasing lock "refresh_cache-295cb2fd-b409-4d5c-8fef-12b7acd9fec0" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2323.164696] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=62277) 
run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2362.975063] env[62277]: WARNING oslo_vmware.rw_handles [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2362.975063] env[62277]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2362.975063] env[62277]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2362.975063] env[62277]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2362.975063] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2362.975063] env[62277]: ERROR oslo_vmware.rw_handles response.begin() [ 2362.975063] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2362.975063] env[62277]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2362.975063] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2362.975063] env[62277]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2362.975063] env[62277]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2362.975063] env[62277]: ERROR oslo_vmware.rw_handles [ 2362.975621] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Downloaded image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to vmware_temp/0dde662e-ad74-4745-ac67-87f037ae3caa/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2362.977756] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Caching image {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2362.978033] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Copying Virtual Disk [datastore2] vmware_temp/0dde662e-ad74-4745-ac67-87f037ae3caa/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk to [datastore2] vmware_temp/0dde662e-ad74-4745-ac67-87f037ae3caa/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk {{(pid=62277) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2362.978335] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-a2a84789-0972-4bb9-9021-2ec1574e1673 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2362.985734] env[62277]: DEBUG oslo_vmware.api [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Waiting for the task: (returnval){ [ 
2362.985734] env[62277]: value = "task-1405512" [ 2362.985734] env[62277]: _type = "Task" [ 2362.985734] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2362.993645] env[62277]: DEBUG oslo_vmware.api [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Task: {'id': task-1405512, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2363.498484] env[62277]: DEBUG oslo_vmware.exceptions [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Fault InvalidArgument not matched. {{(pid=62277) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2363.498722] env[62277]: DEBUG oslo_concurrency.lockutils [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2363.499271] env[62277]: ERROR nova.compute.manager [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2363.499271] env[62277]: Faults: ['InvalidArgument'] [ 2363.499271] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Traceback (most recent call last): [ 2363.499271] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2363.499271] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] yield resources [ 2363.499271] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2363.499271] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] self.driver.spawn(context, instance, image_meta, [ 2363.499271] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2363.499271] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2363.499271] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2363.499271] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] self._fetch_image_if_missing(context, vi) [ 2363.499271] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2363.499271] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] image_cache(vi, 
tmp_image_ds_loc) [ 2363.499271] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2363.499271] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] vm_util.copy_virtual_disk( [ 2363.499271] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2363.499271] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] session._wait_for_task(vmdk_copy_task) [ 2363.499271] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2363.499271] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] return self.wait_for_task(task_ref) [ 2363.499271] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2363.499271] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] return evt.wait() [ 2363.499271] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2363.499271] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] result = hub.switch() [ 2363.499271] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2363.499271] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] return self.greenlet.switch() [ 2363.499271] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2363.499271] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] self.f(*self.args, **self.kw) [ 2363.499271] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2363.499271] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] raise exceptions.translate_fault(task_info.error) [ 2363.499271] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2363.499271] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Faults: ['InvalidArgument'] [ 2363.499271] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] [ 2363.500068] env[62277]: INFO nova.compute.manager [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Terminating instance [ 2363.501051] env[62277]: DEBUG oslo_concurrency.lockutils [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 
tempest-ServersTestJSON-1136389312-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2363.501258] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2363.501490] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-323fd6a6-db22-42c6-8213-1a63eef0d8ad {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2363.503429] env[62277]: DEBUG oslo_concurrency.lockutils [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Acquiring lock "refresh_cache-8b9ef530-e79f-4cd4-8a88-83871ed65f90" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2363.503586] env[62277]: DEBUG oslo_concurrency.lockutils [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Acquired lock "refresh_cache-8b9ef530-e79f-4cd4-8a88-83871ed65f90" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2363.503745] env[62277]: DEBUG nova.network.neutron [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Building network info cache for instance {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2363.510220] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2363.510386] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62277) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2363.511054] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9e002d73-caf0-4aef-8527-14d1d650a3ba {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2363.518399] env[62277]: DEBUG oslo_vmware.api [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Waiting for the task: (returnval){ [ 2363.518399] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]529c0bd5-6859-b562-f8e2-6068cad93e61" [ 2363.518399] env[62277]: _type = "Task" [ 2363.518399] env[62277]: } to complete. 
{{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2363.525488] env[62277]: DEBUG oslo_vmware.api [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]529c0bd5-6859-b562-f8e2-6068cad93e61, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2363.531516] env[62277]: DEBUG nova.network.neutron [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Instance cache missing network info. {{(pid=62277) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2363.589689] env[62277]: DEBUG nova.network.neutron [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2363.598695] env[62277]: DEBUG oslo_concurrency.lockutils [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Releasing lock "refresh_cache-8b9ef530-e79f-4cd4-8a88-83871ed65f90" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2363.599102] env[62277]: DEBUG nova.compute.manager [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Start destroying the instance on the hypervisor. 
{{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2363.599299] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2363.600363] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-08bf2334-6eb3-4290-b1e3-d7b6a53d8fe5 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2363.607947] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Unregistering the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2363.608174] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-87f1591e-68d3-4009-907d-52c51ccf9ca8 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2363.645698] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Unregistered the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2363.645899] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Deleting contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2363.646081] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Deleting the datastore file [datastore2] 8b9ef530-e79f-4cd4-8a88-83871ed65f90 {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2363.646315] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-8013c3b1-19de-40b7-a720-1378f4651087 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2363.651784] env[62277]: DEBUG oslo_vmware.api [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Waiting for the task: (returnval){ [ 2363.651784] env[62277]: value = "task-1405514" [ 2363.651784] env[62277]: _type = "Task" [ 2363.651784] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2363.659519] env[62277]: DEBUG oslo_vmware.api [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Task: {'id': task-1405514, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2364.028582] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Preparing fetch location {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2364.028839] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Creating directory with path [datastore2] vmware_temp/59191d02-ea5c-4807-9ffb-58338d65e043/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2364.029076] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a39533aa-f1af-4404-b757-52c97be40de5 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2364.039662] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Created directory with path [datastore2] vmware_temp/59191d02-ea5c-4807-9ffb-58338d65e043/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2364.039823] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Fetch image to [datastore2] vmware_temp/59191d02-ea5c-4807-9ffb-58338d65e043/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2364.039961] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to [datastore2] vmware_temp/59191d02-ea5c-4807-9ffb-58338d65e043/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2364.040690] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-59168506-ef4f-4d1b-a0b9-6a8d6979a899 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2364.047239] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7344f384-51e1-47d4-8179-6deeddaa6080 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2364.055957] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8cdc224c-cc91-4c8a-9533-c47fce8f7b8c {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2364.086666] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cc52166d-8b44-49ec-abfb-9d6464feec22 {{(pid=62277) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2364.092513] env[62277]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-04fcfe40-aaad-4048-9af4-d43a5cb571e9 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2364.113851] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2364.162014] env[62277]: DEBUG oslo_vmware.api [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Task: {'id': task-1405514, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.030531} completed successfully. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2364.162014] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Deleted the datastore file {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2364.162014] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Deleted contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2364.162014] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2364.162014] env[62277]: INFO nova.compute.manager [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Took 0.56 seconds to destroy the instance on the hypervisor. [ 2364.162292] env[62277]: DEBUG oslo.service.loopingcall [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2364.162329] env[62277]: DEBUG nova.compute.manager [-] [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Skipping network deallocation for instance since networking was not requested. 
{{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 2364.164396] env[62277]: DEBUG oslo_vmware.rw_handles [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/59191d02-ea5c-4807-9ffb-58338d65e043/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62277) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2364.165751] env[62277]: DEBUG nova.compute.claims [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Aborting claim: {{(pid=62277) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 2364.165917] env[62277]: DEBUG oslo_concurrency.lockutils [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2364.166141] env[62277]: DEBUG oslo_concurrency.lockutils [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2364.227818] env[62277]: DEBUG oslo_vmware.rw_handles [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Completed reading data from the image iterator. {{(pid=62277) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2364.227818] env[62277]: DEBUG oslo_vmware.rw_handles [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/59191d02-ea5c-4807-9ffb-58338d65e043/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=62277) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2364.342895] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0e9f324c-354e-4211-bd97-06c1fc1ee467 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2364.351050] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-247eb3da-75ac-4d27-9be1-20d717fc1771 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2364.379505] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a9f49f3-4080-40e0-9593-8815b95ec16b {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2364.387047] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e377477f-8af5-4a1b-9ed6-cd7a410af67d {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2364.399588] env[62277]: DEBUG nova.compute.provider_tree [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2364.408097] env[62277]: DEBUG nova.scheduler.client.report [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2364.422588] env[62277]: DEBUG oslo_concurrency.lockutils [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.256s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2364.423164] env[62277]: ERROR nova.compute.manager [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2364.423164] env[62277]: Faults: ['InvalidArgument'] [ 2364.423164] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Traceback (most recent call last): [ 2364.423164] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2364.423164] env[62277]: ERROR nova.compute.manager 
[instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] self.driver.spawn(context, instance, image_meta, [ 2364.423164] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2364.423164] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2364.423164] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2364.423164] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] self._fetch_image_if_missing(context, vi) [ 2364.423164] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2364.423164] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] image_cache(vi, tmp_image_ds_loc) [ 2364.423164] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2364.423164] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] vm_util.copy_virtual_disk( [ 2364.423164] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2364.423164] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] session._wait_for_task(vmdk_copy_task) [ 2364.423164] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2364.423164] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] return self.wait_for_task(task_ref) [ 2364.423164] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2364.423164] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] return evt.wait() [ 2364.423164] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2364.423164] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] result = hub.switch() [ 2364.423164] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2364.423164] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] return self.greenlet.switch() [ 2364.423164] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2364.423164] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] self.f(*self.args, **self.kw) [ 2364.423164] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2364.423164] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] raise exceptions.translate_fault(task_info.error) [ 2364.423164] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2364.423164] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Faults: ['InvalidArgument'] [ 2364.423164] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] [ 2364.423988] env[62277]: DEBUG nova.compute.utils [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] VimFaultException {{(pid=62277) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2364.425335] env[62277]: DEBUG nova.compute.manager [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Build of instance 8b9ef530-e79f-4cd4-8a88-83871ed65f90 was re-scheduled: A specified parameter was not correct: fileType [ 2364.425335] env[62277]: Faults: ['InvalidArgument'] {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2364.425699] env[62277]: DEBUG nova.compute.manager [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Unplugging VIFs for instance {{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2364.425914] env[62277]: DEBUG oslo_concurrency.lockutils [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Acquiring lock "refresh_cache-8b9ef530-e79f-4cd4-8a88-83871ed65f90" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2364.426126] env[62277]: DEBUG oslo_concurrency.lockutils [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Acquired lock "refresh_cache-8b9ef530-e79f-4cd4-8a88-83871ed65f90" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2364.426232] env[62277]: DEBUG nova.network.neutron [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Building network info cache for instance {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2364.449818] env[62277]: DEBUG nova.network.neutron [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Instance cache missing network info. 
{{(pid=62277) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2364.505866] env[62277]: DEBUG nova.network.neutron [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2364.514414] env[62277]: DEBUG oslo_concurrency.lockutils [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Releasing lock "refresh_cache-8b9ef530-e79f-4cd4-8a88-83871ed65f90" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2364.514621] env[62277]: DEBUG nova.compute.manager [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2364.514800] env[62277]: DEBUG nova.compute.manager [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Skipping network deallocation for instance since networking was not requested. {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 2364.596014] env[62277]: INFO nova.scheduler.client.report [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Deleted allocations for instance 8b9ef530-e79f-4cd4-8a88-83871ed65f90 [ 2364.614109] env[62277]: DEBUG oslo_concurrency.lockutils [None req-9965c055-77b5-44e8-8ffc-4b65735b11de tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Lock "8b9ef530-e79f-4cd4-8a88-83871ed65f90" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 632.610s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2364.614398] env[62277]: DEBUG oslo_concurrency.lockutils [None req-0066ec6d-492d-414e-babb-2c6a13a6a502 tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Lock "8b9ef530-e79f-4cd4-8a88-83871ed65f90" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 435.552s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2364.614617] env[62277]: DEBUG oslo_concurrency.lockutils [None req-0066ec6d-492d-414e-babb-2c6a13a6a502 tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Acquiring lock "8b9ef530-e79f-4cd4-8a88-83871ed65f90-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2364.614819] env[62277]: DEBUG oslo_concurrency.lockutils [None req-0066ec6d-492d-414e-babb-2c6a13a6a502 tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Lock "8b9ef530-e79f-4cd4-8a88-83871ed65f90-events" acquired 
by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2364.614998] env[62277]: DEBUG oslo_concurrency.lockutils [None req-0066ec6d-492d-414e-babb-2c6a13a6a502 tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Lock "8b9ef530-e79f-4cd4-8a88-83871ed65f90-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2364.616913] env[62277]: INFO nova.compute.manager [None req-0066ec6d-492d-414e-babb-2c6a13a6a502 tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Terminating instance [ 2364.618472] env[62277]: DEBUG oslo_concurrency.lockutils [None req-0066ec6d-492d-414e-babb-2c6a13a6a502 tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Acquiring lock "refresh_cache-8b9ef530-e79f-4cd4-8a88-83871ed65f90" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2364.618627] env[62277]: DEBUG oslo_concurrency.lockutils [None req-0066ec6d-492d-414e-babb-2c6a13a6a502 tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Acquired lock "refresh_cache-8b9ef530-e79f-4cd4-8a88-83871ed65f90" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2364.618790] env[62277]: DEBUG nova.network.neutron [None req-0066ec6d-492d-414e-babb-2c6a13a6a502 tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Building network info cache for instance {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2364.655743] env[62277]: DEBUG nova.network.neutron [None req-0066ec6d-492d-414e-babb-2c6a13a6a502 tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Instance cache missing network info. {{(pid=62277) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2364.721277] env[62277]: DEBUG nova.network.neutron [None req-0066ec6d-492d-414e-babb-2c6a13a6a502 tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2364.731122] env[62277]: DEBUG oslo_concurrency.lockutils [None req-0066ec6d-492d-414e-babb-2c6a13a6a502 tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Releasing lock "refresh_cache-8b9ef530-e79f-4cd4-8a88-83871ed65f90" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2364.731519] env[62277]: DEBUG nova.compute.manager [None req-0066ec6d-492d-414e-babb-2c6a13a6a502 tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Start destroying the instance on the hypervisor. 
{{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2364.731710] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-0066ec6d-492d-414e-babb-2c6a13a6a502 tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2364.732263] env[62277]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-6c4b3b9c-3269-456c-9332-086e4fb8438e {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2364.741661] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-be97e7e0-2710-4fc6-ab27-1fb1ae7a5a48 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2364.768316] env[62277]: WARNING nova.virt.vmwareapi.vmops [None req-0066ec6d-492d-414e-babb-2c6a13a6a502 tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 8b9ef530-e79f-4cd4-8a88-83871ed65f90 could not be found. [ 2364.768501] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-0066ec6d-492d-414e-babb-2c6a13a6a502 tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2364.768678] env[62277]: INFO nova.compute.manager [None req-0066ec6d-492d-414e-babb-2c6a13a6a502 tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2364.768904] env[62277]: DEBUG oslo.service.loopingcall [None req-0066ec6d-492d-414e-babb-2c6a13a6a502 tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2364.769141] env[62277]: DEBUG nova.compute.manager [-] [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2364.769236] env[62277]: DEBUG nova.network.neutron [-] [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2364.872573] env[62277]: DEBUG neutronclient.v2_0.client [-] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=62277) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 2364.872844] env[62277]: ERROR nova.network.neutron [-] Neutron client was not able to generate a valid admin token, please verify Neutron admin credential located in nova.conf: neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 2364.873364] env[62277]: ERROR oslo.service.loopingcall [-] Dynamic interval looping call 'oslo_service.loopingcall.RetryDecorator.__call__.._func' failed: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 2364.873364] env[62277]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 2364.873364] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2364.873364] env[62277]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 2364.873364] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 2364.873364] env[62277]: ERROR oslo.service.loopingcall exception_handler_v20(status_code, error_body) [ 2364.873364] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 2364.873364] env[62277]: ERROR oslo.service.loopingcall raise client_exc(message=error_message, [ 2364.873364] env[62277]: ERROR oslo.service.loopingcall neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 2364.873364] env[62277]: ERROR oslo.service.loopingcall Neutron server returns request_ids: ['req-82d15664-78e4-4b69-a3f9-5ef9097652b5'] [ 2364.873364] env[62277]: ERROR oslo.service.loopingcall [ 2364.873364] env[62277]: ERROR oslo.service.loopingcall During handling of the above exception, another exception occurred: [ 2364.873364] env[62277]: ERROR oslo.service.loopingcall [ 2364.873364] env[62277]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 2364.873364] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 2364.873364] env[62277]: ERROR oslo.service.loopingcall result = func(*self.args, **self.kw) [ 2364.873364] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 2364.873364] env[62277]: ERROR oslo.service.loopingcall result = f(*args, 
**kwargs) [ 2364.873364] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 2364.873364] env[62277]: ERROR oslo.service.loopingcall self._deallocate_network( [ 2364.873364] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 2364.873364] env[62277]: ERROR oslo.service.loopingcall self.network_api.deallocate_for_instance( [ 2364.873364] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 2364.873364] env[62277]: ERROR oslo.service.loopingcall data = neutron.list_ports(**search_opts) [ 2364.873364] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2364.873364] env[62277]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 2364.873364] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 2364.873364] env[62277]: ERROR oslo.service.loopingcall return self.list('ports', self.ports_path, retrieve_all, [ 2364.873364] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2364.873364] env[62277]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 2364.873364] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 2364.873364] env[62277]: ERROR oslo.service.loopingcall for r in self._pagination(collection, path, **params): [ 2364.873364] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 2364.873364] env[62277]: ERROR oslo.service.loopingcall res = self.get(path, params=params) [ 2364.873364] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2364.873364] env[62277]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 2364.873364] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 2364.873364] env[62277]: ERROR oslo.service.loopingcall return self.retry_request("GET", action, body=body, [ 2364.873364] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2364.873364] env[62277]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 2364.873364] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 2364.873364] env[62277]: ERROR oslo.service.loopingcall return self.do_request(method, action, body=body, [ 2364.873364] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2364.873364] env[62277]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 2364.873364] env[62277]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 2364.873364] env[62277]: ERROR oslo.service.loopingcall self._handle_fault_response(status_code, replybody, resp) [ 2364.874634] env[62277]: ERROR oslo.service.loopingcall File 
"/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 2364.874634] env[62277]: ERROR oslo.service.loopingcall raise exception.NeutronAdminCredentialConfigurationInvalid() [ 2364.874634] env[62277]: ERROR oslo.service.loopingcall nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 2364.874634] env[62277]: ERROR oslo.service.loopingcall [ 2364.874634] env[62277]: ERROR nova.compute.manager [None req-0066ec6d-492d-414e-babb-2c6a13a6a502 tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Failed to deallocate network for instance. Error: Networking client is experiencing an unauthorized exception.: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 2364.911256] env[62277]: ERROR nova.compute.manager [None req-0066ec6d-492d-414e-babb-2c6a13a6a502 tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Setting instance vm_state to ERROR: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 2364.911256] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Traceback (most recent call last): [ 2364.911256] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2364.911256] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] ret = obj(*args, **kwargs) [ 2364.911256] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 2364.911256] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] exception_handler_v20(status_code, error_body) [ 2364.911256] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 2364.911256] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] raise client_exc(message=error_message, [ 2364.911256] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 2364.911256] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Neutron server returns request_ids: ['req-82d15664-78e4-4b69-a3f9-5ef9097652b5'] [ 2364.911256] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] [ 2364.911256] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] During handling of the above exception, another exception occurred: [ 2364.911256] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] [ 2364.911256] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Traceback (most recent call last): [ 2364.911256] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] File 
"/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance [ 2364.911256] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] self._delete_instance(context, instance, bdms) [ 2364.911256] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] File "/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance [ 2364.911256] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] self._shutdown_instance(context, instance, bdms) [ 2364.911256] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance [ 2364.911256] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] self._try_deallocate_network(context, instance, requested_networks) [ 2364.911256] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network [ 2364.911256] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] with excutils.save_and_reraise_exception(): [ 2364.911256] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 2364.911256] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] self.force_reraise() [ 2364.911256] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 2364.911256] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] raise self.value [ 2364.911256] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network [ 2364.911256] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] _deallocate_network_with_retries() [ 2364.911256] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 2364.911256] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] return evt.wait() [ 2364.911256] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2364.911256] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] result = hub.switch() [ 2364.912218] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2364.912218] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] return self.greenlet.switch() [ 2364.912218] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 2364.912218] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] result = 
func(*self.args, **self.kw) [ 2364.912218] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 2364.912218] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] result = f(*args, **kwargs) [ 2364.912218] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 2364.912218] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] self._deallocate_network( [ 2364.912218] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 2364.912218] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] self.network_api.deallocate_for_instance( [ 2364.912218] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 2364.912218] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] data = neutron.list_ports(**search_opts) [ 2364.912218] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2364.912218] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] ret = obj(*args, **kwargs) [ 2364.912218] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 2364.912218] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] return self.list('ports', self.ports_path, retrieve_all, [ 2364.912218] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2364.912218] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] ret = obj(*args, **kwargs) [ 2364.912218] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 2364.912218] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] for r in self._pagination(collection, path, **params): [ 2364.912218] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 2364.912218] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] res = self.get(path, params=params) [ 2364.912218] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2364.912218] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] ret = obj(*args, **kwargs) [ 2364.912218] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 
356, in get [ 2364.912218] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] return self.retry_request("GET", action, body=body, [ 2364.912218] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2364.912218] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] ret = obj(*args, **kwargs) [ 2364.912218] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 2364.912218] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] return self.do_request(method, action, body=body, [ 2364.912218] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2364.912218] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] ret = obj(*args, **kwargs) [ 2364.912218] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 2364.913106] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] self._handle_fault_response(status_code, replybody, resp) [ 2364.913106] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 2364.913106] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] raise exception.NeutronAdminCredentialConfigurationInvalid() [ 2364.913106] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 2364.913106] env[62277]: ERROR nova.compute.manager [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] [ 2364.937484] env[62277]: DEBUG oslo_concurrency.lockutils [None req-0066ec6d-492d-414e-babb-2c6a13a6a502 tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Lock "8b9ef530-e79f-4cd4-8a88-83871ed65f90" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.323s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2364.979325] env[62277]: INFO nova.compute.manager [None req-0066ec6d-492d-414e-babb-2c6a13a6a502 tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] [instance: 8b9ef530-e79f-4cd4-8a88-83871ed65f90] Successfully reverted task state from None on failure for instance. [ 2364.982434] env[62277]: ERROR oslo_messaging.rpc.server [None req-0066ec6d-492d-414e-babb-2c6a13a6a502 tempest-ServerShowV247Test-2092232574 tempest-ServerShowV247Test-2092232574-project-member] Exception during message handling: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 2364.982434] env[62277]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 2364.982434] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2364.982434] env[62277]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 2364.982434] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 2364.982434] env[62277]: ERROR oslo_messaging.rpc.server exception_handler_v20(status_code, error_body) [ 2364.982434] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 2364.982434] env[62277]: ERROR oslo_messaging.rpc.server raise client_exc(message=error_message, [ 2364.982434] env[62277]: ERROR oslo_messaging.rpc.server neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 2364.982434] env[62277]: ERROR oslo_messaging.rpc.server Neutron server returns request_ids: ['req-82d15664-78e4-4b69-a3f9-5ef9097652b5'] [ 2364.982434] env[62277]: ERROR oslo_messaging.rpc.server [ 2364.982434] env[62277]: ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred: [ 2364.982434] env[62277]: ERROR oslo_messaging.rpc.server [ 2364.982434] env[62277]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 2364.982434] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming [ 2364.982434] env[62277]: ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message) [ 2364.982434] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch [ 2364.982434] env[62277]: ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) [ 2364.982434] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch [ 2364.982434] env[62277]: ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) [ 2364.982434] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 65, in wrapped [ 2364.982434] env[62277]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 2364.982434] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 2364.982434] env[62277]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 2364.982434] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 2364.982434] env[62277]: ERROR oslo_messaging.rpc.server raise self.value [ 2364.982434] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 63, in wrapped [ 2364.982434] env[62277]: ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw) [ 2364.982434] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 166, in decorated_function [ 2364.982434] env[62277]: ERROR oslo_messaging.rpc.server with 
excutils.save_and_reraise_exception(): [ 2364.982434] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 2364.982434] env[62277]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 2364.982434] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 2364.982434] env[62277]: ERROR oslo_messaging.rpc.server raise self.value [ 2364.982434] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 157, in decorated_function [ 2364.982434] env[62277]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 2364.982434] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/utils.py", line 1439, in decorated_function [ 2364.982434] env[62277]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 2364.982434] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 213, in decorated_function [ 2364.982434] env[62277]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 2364.982434] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 2364.982434] env[62277]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 2364.982434] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 2364.982434] env[62277]: ERROR oslo_messaging.rpc.server raise self.value [ 2364.982434] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 203, in decorated_function [ 2364.983739] env[62277]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 2364.983739] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3327, in terminate_instance [ 2364.983739] env[62277]: ERROR oslo_messaging.rpc.server do_terminate_instance(instance, bdms) [ 2364.983739] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 2364.983739] env[62277]: ERROR oslo_messaging.rpc.server return f(*args, **kwargs) [ 2364.983739] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3322, in do_terminate_instance [ 2364.983739] env[62277]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 2364.983739] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 2364.983739] env[62277]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 2364.983739] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 2364.983739] env[62277]: ERROR oslo_messaging.rpc.server raise self.value [ 2364.983739] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance [ 2364.983739] env[62277]: ERROR oslo_messaging.rpc.server self._delete_instance(context, instance, bdms) [ 2364.983739] env[62277]: ERROR oslo_messaging.rpc.server File 
"/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance [ 2364.983739] env[62277]: ERROR oslo_messaging.rpc.server self._shutdown_instance(context, instance, bdms) [ 2364.983739] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance [ 2364.983739] env[62277]: ERROR oslo_messaging.rpc.server self._try_deallocate_network(context, instance, requested_networks) [ 2364.983739] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network [ 2364.983739] env[62277]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 2364.983739] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 2364.983739] env[62277]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 2364.983739] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 2364.983739] env[62277]: ERROR oslo_messaging.rpc.server raise self.value [ 2364.983739] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network [ 2364.983739] env[62277]: ERROR oslo_messaging.rpc.server _deallocate_network_with_retries() [ 2364.983739] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 2364.983739] env[62277]: ERROR oslo_messaging.rpc.server return evt.wait() [ 2364.983739] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2364.983739] env[62277]: ERROR oslo_messaging.rpc.server result = hub.switch() [ 2364.983739] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2364.983739] env[62277]: ERROR oslo_messaging.rpc.server return self.greenlet.switch() [ 2364.983739] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 2364.983739] env[62277]: ERROR oslo_messaging.rpc.server result = func(*self.args, **self.kw) [ 2364.983739] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 2364.983739] env[62277]: ERROR oslo_messaging.rpc.server result = f(*args, **kwargs) [ 2364.983739] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 2364.983739] env[62277]: ERROR oslo_messaging.rpc.server self._deallocate_network( [ 2364.983739] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 2364.983739] env[62277]: ERROR oslo_messaging.rpc.server self.network_api.deallocate_for_instance( [ 2364.983739] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 2364.983739] env[62277]: ERROR oslo_messaging.rpc.server data = neutron.list_ports(**search_opts) [ 2364.983739] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2364.983739] env[62277]: ERROR 
oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 2364.983739] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 2364.983739] env[62277]: ERROR oslo_messaging.rpc.server return self.list('ports', self.ports_path, retrieve_all, [ 2364.983739] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2364.983739] env[62277]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 2364.984999] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 2364.984999] env[62277]: ERROR oslo_messaging.rpc.server for r in self._pagination(collection, path, **params): [ 2364.984999] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 2364.984999] env[62277]: ERROR oslo_messaging.rpc.server res = self.get(path, params=params) [ 2364.984999] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2364.984999] env[62277]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 2364.984999] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 2364.984999] env[62277]: ERROR oslo_messaging.rpc.server return self.retry_request("GET", action, body=body, [ 2364.984999] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2364.984999] env[62277]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 2364.984999] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 2364.984999] env[62277]: ERROR oslo_messaging.rpc.server return self.do_request(method, action, body=body, [ 2364.984999] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 2364.984999] env[62277]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 2364.984999] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 2364.984999] env[62277]: ERROR oslo_messaging.rpc.server self._handle_fault_response(status_code, replybody, resp) [ 2364.984999] env[62277]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 2364.984999] env[62277]: ERROR oslo_messaging.rpc.server raise exception.NeutronAdminCredentialConfigurationInvalid() [ 2364.984999] env[62277]: ERROR oslo_messaging.rpc.server nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 2364.984999] env[62277]: ERROR oslo_messaging.rpc.server [ 2372.170617] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2374.168157] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2377.163711] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2378.168562] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2378.168904] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Starting heal instance info cache {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9909}} [ 2378.168904] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Rebuilding the list of instances to heal {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9913}} [ 2378.189544] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2378.189689] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2378.189824] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2378.189950] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 297d53df-7918-4389-9c63-a600755da969] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2378.190104] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2378.190262] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Skipping network cache update for instance because it is Building. 
{{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2378.190393] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2378.190516] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2378.190636] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2378.190755] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Didn't find any instances for network info cache update. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9995}} [ 2378.191227] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2379.168261] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2379.168577] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager.update_available_resource {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2379.179848] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2379.180084] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2379.180262] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2379.180443] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62277) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 
2379.181524] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-40f445b5-3b66-4e8e-9398-935ad34a455c {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2379.190401] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fbf6b590-60dc-423b-8f83-82c0e064b22a {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2379.205347] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-27cf4bc1-95de-4563-a09a-69608f9b160c {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2379.211429] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-df18e61b-d533-461d-a70e-4fb519061e3e {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2379.239806] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181343MB free_disk=184GB free_vcpus=48 pci_devices=None {{(pid=62277) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2379.239970] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2379.240100] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2379.309666] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance c4c22c8a-4300-45ce-8484-77c638f7bbc5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2379.309835] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance ad83bf06-d712-4bd4-8086-9c3b615adaf5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2379.309960] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance de543a46-26c3-40b3-9ccd-80bb1f9845d7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2379.310123] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 297d53df-7918-4389-9c63-a600755da969 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2379.310227] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 940561d5-723b-4e43-8fab-35e8af95ce09 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2379.310306] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 6737c3b9-d9e6-4879-a6df-46d3c7dee40e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2379.310511] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 4f26ed27-558d-489a-9141-ec63b6164cc8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2379.310709] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 26a7549d-94b4-4113-ab8b-10886eafcd49 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2379.310922] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 295cb2fd-b409-4d5c-8fef-12b7acd9fec0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2379.311266] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Total usable vcpus: 48, total allocated vcpus: 9 {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2379.311504] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1664MB phys_disk=200GB used_disk=9GB total_vcpus=48 used_vcpus=9 pci_stats=[] {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2379.424172] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b0e94fe-e784-490c-b4d3-b298e388803e {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2379.431867] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4475de6f-8124-4d54-9771-6c1ca0dd50f6 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2379.461958] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-055410fd-6844-4805-b4b9-213da0a9aa08 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2379.469135] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ebe58b4-6be3-4a8d-8a23-a3f5af37de18 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2379.482008] env[62277]: DEBUG nova.compute.provider_tree [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2379.490488] env[62277]: DEBUG nova.scheduler.client.report [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2379.504578] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62277) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2379.504761] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.265s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2380.505637] env[62277]: DEBUG oslo_service.periodic_task [None 
req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2380.505985] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=62277) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10528}} [ 2381.169045] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2396.710492] env[62277]: DEBUG oslo_concurrency.lockutils [None req-97cb2248-d5b7-4933-b0e6-6c69a74f9c0c tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Acquiring lock "4f26ed27-558d-489a-9141-ec63b6164cc8" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2410.079851] env[62277]: WARNING oslo_vmware.rw_handles [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2410.079851] env[62277]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2410.079851] env[62277]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2410.079851] env[62277]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2410.079851] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2410.079851] env[62277]: ERROR oslo_vmware.rw_handles response.begin() [ 2410.079851] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2410.079851] env[62277]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2410.079851] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2410.079851] env[62277]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2410.079851] env[62277]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2410.079851] env[62277]: ERROR oslo_vmware.rw_handles [ 2410.079851] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Downloaded image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to vmware_temp/59191d02-ea5c-4807-9ffb-58338d65e043/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2410.081835] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Caching image {{(pid=62277) _fetch_image_if_missing
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2410.082107] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Copying Virtual Disk [datastore2] vmware_temp/59191d02-ea5c-4807-9ffb-58338d65e043/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk to [datastore2] vmware_temp/59191d02-ea5c-4807-9ffb-58338d65e043/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk {{(pid=62277) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2410.082441] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-dd918637-3cbf-4be8-98f3-5680a9c6d021 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2410.090687] env[62277]: DEBUG oslo_vmware.api [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Waiting for the task: (returnval){ [ 2410.090687] env[62277]: value = "task-1405515" [ 2410.090687] env[62277]: _type = "Task" [ 2410.090687] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2410.099202] env[62277]: DEBUG oslo_vmware.api [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Task: {'id': task-1405515, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2410.600619] env[62277]: DEBUG oslo_vmware.exceptions [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Fault InvalidArgument not matched. 
{{(pid=62277) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2410.600897] env[62277]: DEBUG oslo_concurrency.lockutils [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2410.601476] env[62277]: ERROR nova.compute.manager [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2410.601476] env[62277]: Faults: ['InvalidArgument'] [ 2410.601476] env[62277]: ERROR nova.compute.manager [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Traceback (most recent call last): [ 2410.601476] env[62277]: ERROR nova.compute.manager [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2410.601476] env[62277]: ERROR nova.compute.manager [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] yield resources [ 2410.601476] env[62277]: ERROR nova.compute.manager [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2410.601476] env[62277]: ERROR nova.compute.manager [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] self.driver.spawn(context, instance, image_meta, [ 2410.601476] env[62277]: ERROR nova.compute.manager [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2410.601476] env[62277]: ERROR nova.compute.manager [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2410.601476] env[62277]: ERROR nova.compute.manager [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2410.601476] env[62277]: ERROR nova.compute.manager [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] self._fetch_image_if_missing(context, vi) [ 2410.601476] env[62277]: ERROR nova.compute.manager [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2410.601476] env[62277]: ERROR nova.compute.manager [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] image_cache(vi, tmp_image_ds_loc) [ 2410.601476] env[62277]: ERROR nova.compute.manager [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2410.601476] env[62277]: ERROR nova.compute.manager [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] vm_util.copy_virtual_disk( [ 2410.601476] env[62277]: ERROR nova.compute.manager [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2410.601476] env[62277]: ERROR nova.compute.manager [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] session._wait_for_task(vmdk_copy_task) [ 2410.601476] env[62277]: ERROR nova.compute.manager [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in 
_wait_for_task [ 2410.601476] env[62277]: ERROR nova.compute.manager [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] return self.wait_for_task(task_ref) [ 2410.601476] env[62277]: ERROR nova.compute.manager [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2410.601476] env[62277]: ERROR nova.compute.manager [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] return evt.wait() [ 2410.601476] env[62277]: ERROR nova.compute.manager [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2410.601476] env[62277]: ERROR nova.compute.manager [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] result = hub.switch() [ 2410.601476] env[62277]: ERROR nova.compute.manager [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2410.601476] env[62277]: ERROR nova.compute.manager [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] return self.greenlet.switch() [ 2410.601476] env[62277]: ERROR nova.compute.manager [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2410.601476] env[62277]: ERROR nova.compute.manager [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] self.f(*self.args, **self.kw) [ 2410.601476] env[62277]: ERROR nova.compute.manager [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2410.601476] env[62277]: ERROR nova.compute.manager [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] raise exceptions.translate_fault(task_info.error) [ 2410.601476] env[62277]: ERROR nova.compute.manager [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2410.601476] env[62277]: ERROR nova.compute.manager [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Faults: ['InvalidArgument'] [ 2410.601476] env[62277]: ERROR nova.compute.manager [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] [ 2410.602387] env[62277]: INFO nova.compute.manager [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Terminating instance [ 2410.603359] env[62277]: DEBUG oslo_concurrency.lockutils [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2410.603557] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2410.603802] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-212bb2f0-ff68-4fbc-aa40-c23a7f40ab86 {{(pid=62277) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2410.606226] env[62277]: DEBUG nova.compute.manager [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Start destroying the instance on the hypervisor. {{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2410.606405] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2410.607119] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0535899a-08e3-47a0-ab66-f69410d827b5 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2410.613490] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Unregistering the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2410.613714] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-90b5bd5e-db00-448b-87ad-592094d3b0e5 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2410.615824] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2410.615988] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62277) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2410.616904] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c9ee077b-a481-47fd-9d6a-16b82d0ae18d {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2410.621489] env[62277]: DEBUG oslo_vmware.api [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Waiting for the task: (returnval){ [ 2410.621489] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]5242e06b-b23c-e860-b206-b35de99ae1c6" [ 2410.621489] env[62277]: _type = "Task" [ 2410.621489] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2410.628400] env[62277]: DEBUG oslo_vmware.api [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]5242e06b-b23c-e860-b206-b35de99ae1c6, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2410.677216] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Unregistered the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2410.677457] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Deleting contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2410.677612] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Deleting the datastore file [datastore2] c4c22c8a-4300-45ce-8484-77c638f7bbc5 {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2410.677888] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-388cd704-61c6-40a5-8cd4-cccf3a530742 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2410.684192] env[62277]: DEBUG oslo_vmware.api [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Waiting for the task: (returnval){ [ 2410.684192] env[62277]: value = "task-1405517" [ 2410.684192] env[62277]: _type = "Task" [ 2410.684192] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2410.691983] env[62277]: DEBUG oslo_vmware.api [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Task: {'id': task-1405517, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2411.133270] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Preparing fetch location {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2411.133611] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Creating directory with path [datastore2] vmware_temp/a7a5e793-cae5-4780-9e83-c9a1713c0eb6/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2411.133842] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5fdc88d8-70ea-4e92-80e2-406bb8e74d39 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2411.145536] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Created directory with path [datastore2] vmware_temp/a7a5e793-cae5-4780-9e83-c9a1713c0eb6/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2411.145721] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Fetch image to [datastore2] vmware_temp/a7a5e793-cae5-4780-9e83-c9a1713c0eb6/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2411.145894] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to [datastore2] vmware_temp/a7a5e793-cae5-4780-9e83-c9a1713c0eb6/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2411.146615] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b99b3525-b2cb-4f79-bd07-1a89af918b7c {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2411.153207] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc3358ad-1ddb-4beb-8508-8d4a79f064c1 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2411.162774] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c5da4ae0-9d49-4258-9791-fdc1eed23429 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2411.194697] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-35b9d9b9-1322-4766-8de8-f2bcf573ea79 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2411.201497] env[62277]: DEBUG oslo_vmware.api [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Task: {'id': task-1405517, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.082732} completed successfully. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2411.202929] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Deleted the datastore file {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2411.203131] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Deleted contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2411.203299] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2411.203476] env[62277]: INFO nova.compute.manager [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 2411.205217] env[62277]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-94392ef3-c6cc-48e3-a6f2-1fe221c89347 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2411.207104] env[62277]: DEBUG nova.compute.claims [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Aborting claim: {{(pid=62277) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 2411.207274] env[62277]: DEBUG oslo_concurrency.lockutils [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2411.207503] env[62277]: DEBUG oslo_concurrency.lockutils [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2411.235351] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2411.289168] env[62277]: DEBUG oslo_vmware.rw_handles [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a7a5e793-cae5-4780-9e83-c9a1713c0eb6/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62277) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2411.349222] env[62277]: DEBUG oslo_vmware.rw_handles [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Completed reading data from the image iterator. {{(pid=62277) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2411.349410] env[62277]: DEBUG oslo_vmware.rw_handles [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a7a5e793-cae5-4780-9e83-c9a1713c0eb6/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=62277) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2411.428552] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-61d81136-548d-4c9a-9eb0-af77e6782839 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2411.436029] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-611c06c2-b349-46ea-8a1e-95f2ac3b3b7f {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2411.467054] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9c2f5848-7ac4-4392-809e-bc0d40d1619b {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2411.473983] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-50637b5a-28cd-4236-9d78-4ea6089462a4 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2411.486880] env[62277]: DEBUG nova.compute.provider_tree [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2411.495063] env[62277]: DEBUG nova.scheduler.client.report [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2411.508228] env[62277]: DEBUG oslo_concurrency.lockutils [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.301s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2411.508743] env[62277]: ERROR nova.compute.manager [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2411.508743] env[62277]: Faults: ['InvalidArgument'] [ 2411.508743] env[62277]: ERROR nova.compute.manager [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Traceback (most recent call last): [ 2411.508743] env[62277]: ERROR nova.compute.manager [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2411.508743] env[62277]: ERROR nova.compute.manager [instance: 
c4c22c8a-4300-45ce-8484-77c638f7bbc5] self.driver.spawn(context, instance, image_meta, [ 2411.508743] env[62277]: ERROR nova.compute.manager [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2411.508743] env[62277]: ERROR nova.compute.manager [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2411.508743] env[62277]: ERROR nova.compute.manager [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2411.508743] env[62277]: ERROR nova.compute.manager [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] self._fetch_image_if_missing(context, vi) [ 2411.508743] env[62277]: ERROR nova.compute.manager [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2411.508743] env[62277]: ERROR nova.compute.manager [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] image_cache(vi, tmp_image_ds_loc) [ 2411.508743] env[62277]: ERROR nova.compute.manager [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2411.508743] env[62277]: ERROR nova.compute.manager [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] vm_util.copy_virtual_disk( [ 2411.508743] env[62277]: ERROR nova.compute.manager [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2411.508743] env[62277]: ERROR nova.compute.manager [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] session._wait_for_task(vmdk_copy_task) [ 2411.508743] env[62277]: ERROR nova.compute.manager [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2411.508743] env[62277]: ERROR nova.compute.manager [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] return self.wait_for_task(task_ref) [ 2411.508743] env[62277]: ERROR nova.compute.manager [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2411.508743] env[62277]: ERROR nova.compute.manager [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] return evt.wait() [ 2411.508743] env[62277]: ERROR nova.compute.manager [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2411.508743] env[62277]: ERROR nova.compute.manager [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] result = hub.switch() [ 2411.508743] env[62277]: ERROR nova.compute.manager [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2411.508743] env[62277]: ERROR nova.compute.manager [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] return self.greenlet.switch() [ 2411.508743] env[62277]: ERROR nova.compute.manager [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2411.508743] env[62277]: ERROR nova.compute.manager [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] self.f(*self.args, **self.kw) [ 2411.508743] env[62277]: ERROR nova.compute.manager [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2411.508743] env[62277]: ERROR nova.compute.manager [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] raise exceptions.translate_fault(task_info.error) [ 2411.508743] env[62277]: ERROR nova.compute.manager [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2411.508743] env[62277]: ERROR nova.compute.manager [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Faults: ['InvalidArgument'] [ 2411.508743] env[62277]: ERROR nova.compute.manager [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] [ 2411.509710] env[62277]: DEBUG nova.compute.utils [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] VimFaultException {{(pid=62277) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2411.510783] env[62277]: DEBUG nova.compute.manager [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Build of instance c4c22c8a-4300-45ce-8484-77c638f7bbc5 was re-scheduled: A specified parameter was not correct: fileType [ 2411.510783] env[62277]: Faults: ['InvalidArgument'] {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2411.511166] env[62277]: DEBUG nova.compute.manager [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Unplugging VIFs for instance {{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2411.511336] env[62277]: DEBUG nova.compute.manager [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2411.511506] env[62277]: DEBUG nova.compute.manager [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2411.511664] env[62277]: DEBUG nova.network.neutron [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2411.777169] env[62277]: DEBUG nova.network.neutron [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2411.787416] env[62277]: INFO nova.compute.manager [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Took 0.28 seconds to deallocate network for instance. [ 2411.875348] env[62277]: INFO nova.scheduler.client.report [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Deleted allocations for instance c4c22c8a-4300-45ce-8484-77c638f7bbc5 [ 2411.898157] env[62277]: DEBUG oslo_concurrency.lockutils [None req-c16587db-cd3b-4577-9940-f175bd2c94dd tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Lock "c4c22c8a-4300-45ce-8484-77c638f7bbc5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 604.385s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2411.898526] env[62277]: DEBUG oslo_concurrency.lockutils [None req-231d0e7a-82be-4285-b16a-80974a83b196 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Lock "c4c22c8a-4300-45ce-8484-77c638f7bbc5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 407.707s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2411.898778] env[62277]: DEBUG oslo_concurrency.lockutils [None req-231d0e7a-82be-4285-b16a-80974a83b196 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Acquiring lock "c4c22c8a-4300-45ce-8484-77c638f7bbc5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2411.898998] env[62277]: DEBUG oslo_concurrency.lockutils [None req-231d0e7a-82be-4285-b16a-80974a83b196 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Lock "c4c22c8a-4300-45ce-8484-77c638f7bbc5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 
2411.899188] env[62277]: DEBUG oslo_concurrency.lockutils [None req-231d0e7a-82be-4285-b16a-80974a83b196 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Lock "c4c22c8a-4300-45ce-8484-77c638f7bbc5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2411.901510] env[62277]: INFO nova.compute.manager [None req-231d0e7a-82be-4285-b16a-80974a83b196 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Terminating instance [ 2411.903270] env[62277]: DEBUG nova.compute.manager [None req-231d0e7a-82be-4285-b16a-80974a83b196 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Start destroying the instance on the hypervisor. {{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2411.903456] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-231d0e7a-82be-4285-b16a-80974a83b196 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2411.903939] env[62277]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-f963f353-4caf-45df-b6c4-57f1fd2607b6 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2411.914216] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eba0a31f-2e7c-4084-bb38-6304c2664bf1 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2411.942389] env[62277]: WARNING nova.virt.vmwareapi.vmops [None req-231d0e7a-82be-4285-b16a-80974a83b196 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance c4c22c8a-4300-45ce-8484-77c638f7bbc5 could not be found. [ 2411.942660] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-231d0e7a-82be-4285-b16a-80974a83b196 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2411.942878] env[62277]: INFO nova.compute.manager [None req-231d0e7a-82be-4285-b16a-80974a83b196 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2411.943146] env[62277]: DEBUG oslo.service.loopingcall [None req-231d0e7a-82be-4285-b16a-80974a83b196 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. 
{{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2411.943376] env[62277]: DEBUG nova.compute.manager [-] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2411.943471] env[62277]: DEBUG nova.network.neutron [-] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2411.972837] env[62277]: DEBUG nova.network.neutron [-] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2411.980667] env[62277]: INFO nova.compute.manager [-] [instance: c4c22c8a-4300-45ce-8484-77c638f7bbc5] Took 0.04 seconds to deallocate network for instance. [ 2412.063907] env[62277]: DEBUG oslo_concurrency.lockutils [None req-231d0e7a-82be-4285-b16a-80974a83b196 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Lock "c4c22c8a-4300-45ce-8484-77c638f7bbc5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.165s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2426.168688] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2426.169025] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Cleaning up deleted instances {{(pid=62277) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11196}} [ 2426.177896] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] There are 0 instances to clean {{(pid=62277) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11205}} [ 2431.127761] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7353d946-5020-413c-a8c2-fdbc24ed782d tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Acquiring lock "26a7549d-94b4-4113-ab8b-10886eafcd49" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2432.177535] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2434.168394] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2436.169448] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2436.169771] env[62277]: DEBUG 
nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Cleaning up deleted instances with incomplete migration {{(pid=62277) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11234}} [ 2437.173532] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2437.811738] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Acquiring lock "5683f242-4848-42fa-9353-46982c3a72c0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2437.811968] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Lock "5683f242-4848-42fa-9353-46982c3a72c0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2437.823679] env[62277]: DEBUG nova.compute.manager [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 5683f242-4848-42fa-9353-46982c3a72c0] Starting instance... {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 2437.956280] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2437.956534] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2437.957999] env[62277]: INFO nova.compute.claims [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 5683f242-4848-42fa-9353-46982c3a72c0] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2438.015161] env[62277]: DEBUG nova.scheduler.client.report [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Refreshing inventories for resource provider 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 2438.028497] env[62277]: DEBUG nova.scheduler.client.report [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 
tempest-DeleteServersTestJSON-689790343-project-member] Updating ProviderTree inventory for provider 75e125ea-a599-4b65-b9cd-6ea881735292 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 2438.028710] env[62277]: DEBUG nova.compute.provider_tree [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Updating inventory in ProviderTree for provider 75e125ea-a599-4b65-b9cd-6ea881735292 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 2438.039560] env[62277]: DEBUG nova.scheduler.client.report [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Refreshing aggregate associations for resource provider 75e125ea-a599-4b65-b9cd-6ea881735292, aggregates: None {{(pid=62277) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 2438.057383] env[62277]: DEBUG nova.scheduler.client.report [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Refreshing trait associations for resource provider 75e125ea-a599-4b65-b9cd-6ea881735292, traits: COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_VMDK {{(pid=62277) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 2438.163452] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc0ba6c7-d185-4b82-9e4b-d7490dd036fa {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2438.171306] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e26e494-2be7-4c5a-82ca-c07f67d974b7 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2438.199866] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48605fe5-6d5b-473a-8054-43102f9d75c3 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2438.206826] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da035e74-57a8-419f-9338-c7a8eac05fb5 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2438.219505] env[62277]: DEBUG nova.compute.provider_tree [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 
tempest-DeleteServersTestJSON-689790343-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2438.227971] env[62277]: DEBUG nova.scheduler.client.report [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2438.242343] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.286s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2438.242809] env[62277]: DEBUG nova.compute.manager [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 5683f242-4848-42fa-9353-46982c3a72c0] Start building networks asynchronously for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 2438.272876] env[62277]: DEBUG nova.compute.utils [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Using /dev/sd instead of None {{(pid=62277) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2438.274298] env[62277]: DEBUG nova.compute.manager [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 5683f242-4848-42fa-9353-46982c3a72c0] Allocating IP information in the background. {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 2438.274466] env[62277]: DEBUG nova.network.neutron [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 5683f242-4848-42fa-9353-46982c3a72c0] allocate_for_instance() {{(pid=62277) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2438.282360] env[62277]: DEBUG nova.compute.manager [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 5683f242-4848-42fa-9353-46982c3a72c0] Start building block device mappings for instance. 
{{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 2438.329156] env[62277]: DEBUG nova.policy [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '00ed93b61873452bbc15280d2de65bd8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0c951cee39d94e49af963590cccf95fb', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62277) authorize /opt/stack/nova/nova/policy.py:203}} [ 2438.344173] env[62277]: DEBUG nova.compute.manager [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 5683f242-4848-42fa-9353-46982c3a72c0] Start spawning the instance on the hypervisor. {{(pid=62277) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 2438.370051] env[62277]: DEBUG nova.virt.hardware [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-11-06T22:11:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-11-06T22:11:15Z,direct_url=,disk_format='vmdk',id=6f125163-af69-40e9-92ae-3b8a01d74b60,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='63ed6630e9c140baa826f53d7a0564d1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-11-06T22:11:16Z,virtual_size=,visibility=), allow threads: False {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2438.370314] env[62277]: DEBUG nova.virt.hardware [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Flavor limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2438.370468] env[62277]: DEBUG nova.virt.hardware [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Image limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2438.370651] env[62277]: DEBUG nova.virt.hardware [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Flavor pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2438.370794] env[62277]: DEBUG nova.virt.hardware [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Image pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2438.370940] env[62277]: DEBUG nova.virt.hardware [None 
req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2438.371163] env[62277]: DEBUG nova.virt.hardware [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2438.371321] env[62277]: DEBUG nova.virt.hardware [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2438.371480] env[62277]: DEBUG nova.virt.hardware [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Got 1 possible topologies {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2438.371638] env[62277]: DEBUG nova.virt.hardware [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2438.371807] env[62277]: DEBUG nova.virt.hardware [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2438.372743] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c9e5269-edd6-496c-ac9c-64c61f6f2197 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2438.380875] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a106cc2d-4b20-449e-bb79-c37b82605eaf {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2438.625722] env[62277]: DEBUG nova.network.neutron [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 5683f242-4848-42fa-9353-46982c3a72c0] Successfully created port: c1c2ce02-2543-4290-bc95-8110dce729c5 {{(pid=62277) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2439.196364] env[62277]: DEBUG nova.compute.manager [req-5e4333c7-ab31-418d-b6e3-fd8ce35d3c91 req-f00bc9e0-7254-4011-9c87-fd3c3d50d8d9 service nova] [instance: 5683f242-4848-42fa-9353-46982c3a72c0] Received event network-vif-plugged-c1c2ce02-2543-4290-bc95-8110dce729c5 {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 2439.196587] env[62277]: DEBUG oslo_concurrency.lockutils [req-5e4333c7-ab31-418d-b6e3-fd8ce35d3c91 req-f00bc9e0-7254-4011-9c87-fd3c3d50d8d9 service nova] Acquiring lock 
"5683f242-4848-42fa-9353-46982c3a72c0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2439.196790] env[62277]: DEBUG oslo_concurrency.lockutils [req-5e4333c7-ab31-418d-b6e3-fd8ce35d3c91 req-f00bc9e0-7254-4011-9c87-fd3c3d50d8d9 service nova] Lock "5683f242-4848-42fa-9353-46982c3a72c0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2439.196957] env[62277]: DEBUG oslo_concurrency.lockutils [req-5e4333c7-ab31-418d-b6e3-fd8ce35d3c91 req-f00bc9e0-7254-4011-9c87-fd3c3d50d8d9 service nova] Lock "5683f242-4848-42fa-9353-46982c3a72c0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2439.197152] env[62277]: DEBUG nova.compute.manager [req-5e4333c7-ab31-418d-b6e3-fd8ce35d3c91 req-f00bc9e0-7254-4011-9c87-fd3c3d50d8d9 service nova] [instance: 5683f242-4848-42fa-9353-46982c3a72c0] No waiting events found dispatching network-vif-plugged-c1c2ce02-2543-4290-bc95-8110dce729c5 {{(pid=62277) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2439.197332] env[62277]: WARNING nova.compute.manager [req-5e4333c7-ab31-418d-b6e3-fd8ce35d3c91 req-f00bc9e0-7254-4011-9c87-fd3c3d50d8d9 service nova] [instance: 5683f242-4848-42fa-9353-46982c3a72c0] Received unexpected event network-vif-plugged-c1c2ce02-2543-4290-bc95-8110dce729c5 for instance with vm_state building and task_state spawning. [ 2439.272413] env[62277]: DEBUG nova.network.neutron [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 5683f242-4848-42fa-9353-46982c3a72c0] Successfully updated port: c1c2ce02-2543-4290-bc95-8110dce729c5 {{(pid=62277) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2439.287357] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Acquiring lock "refresh_cache-5683f242-4848-42fa-9353-46982c3a72c0" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2439.287501] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Acquired lock "refresh_cache-5683f242-4848-42fa-9353-46982c3a72c0" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2439.287648] env[62277]: DEBUG nova.network.neutron [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 5683f242-4848-42fa-9353-46982c3a72c0] Building network info cache for instance {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2439.322902] env[62277]: DEBUG nova.network.neutron [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 5683f242-4848-42fa-9353-46982c3a72c0] Instance cache missing network info. 
{{(pid=62277) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2439.473286] env[62277]: DEBUG nova.network.neutron [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 5683f242-4848-42fa-9353-46982c3a72c0] Updating instance_info_cache with network_info: [{"id": "c1c2ce02-2543-4290-bc95-8110dce729c5", "address": "fa:16:3e:3b:98:c7", "network": {"id": "73ff19f8-2a7e-46b5-ad84-44eb5672dd85", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1025699415-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0c951cee39d94e49af963590cccf95fb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "09bf081b-cdf0-4977-abe2-2339a87409ab", "external-id": "nsx-vlan-transportzone-378", "segmentation_id": 378, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc1c2ce02-25", "ovs_interfaceid": "c1c2ce02-2543-4290-bc95-8110dce729c5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2439.485855] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Releasing lock "refresh_cache-5683f242-4848-42fa-9353-46982c3a72c0" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2439.486151] env[62277]: DEBUG nova.compute.manager [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 5683f242-4848-42fa-9353-46982c3a72c0] Instance network_info: |[{"id": "c1c2ce02-2543-4290-bc95-8110dce729c5", "address": "fa:16:3e:3b:98:c7", "network": {"id": "73ff19f8-2a7e-46b5-ad84-44eb5672dd85", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1025699415-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0c951cee39d94e49af963590cccf95fb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "09bf081b-cdf0-4977-abe2-2339a87409ab", "external-id": "nsx-vlan-transportzone-378", "segmentation_id": 378, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc1c2ce02-25", "ovs_interfaceid": "c1c2ce02-2543-4290-bc95-8110dce729c5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62277) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 2439.486518] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 5683f242-4848-42fa-9353-46982c3a72c0] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:3b:98:c7', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '09bf081b-cdf0-4977-abe2-2339a87409ab', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'c1c2ce02-2543-4290-bc95-8110dce729c5', 'vif_model': 'vmxnet3'}] {{(pid=62277) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2439.494016] env[62277]: DEBUG oslo.service.loopingcall [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2439.494431] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 5683f242-4848-42fa-9353-46982c3a72c0] Creating VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2439.494645] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-a44d8f09-394c-4b41-ab11-8e358709714b {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2439.513663] env[62277]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2439.513663] env[62277]: value = "task-1405518" [ 2439.513663] env[62277]: _type = "Task" [ 2439.513663] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2439.520786] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405518, 'name': CreateVM_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2440.023688] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405518, 'name': CreateVM_Task, 'duration_secs': 0.280101} completed successfully. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2440.023807] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 5683f242-4848-42fa-9353-46982c3a72c0] Created VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2440.024471] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2440.024633] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2440.025017] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2440.025269] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-64cec95d-7dcf-4e16-bdea-4c613bcfecf3 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2440.029498] env[62277]: DEBUG oslo_vmware.api [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Waiting for the task: (returnval){ [ 2440.029498] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]5263691a-86f8-785f-75be-e4de77ffa09e" [ 2440.029498] env[62277]: _type = "Task" [ 2440.029498] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2440.036572] env[62277]: DEBUG oslo_vmware.api [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]5263691a-86f8-785f-75be-e4de77ffa09e, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2440.168341] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2440.168523] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Starting heal instance info cache {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9909}} [ 2440.168647] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Rebuilding the list of instances to heal {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9913}} [ 2440.190092] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2440.190290] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2440.190423] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 297d53df-7918-4389-9c63-a600755da969] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2440.190551] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2440.190674] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2440.190795] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2440.190914] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2440.191043] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Skipping network cache update for instance because it is Building. 
{{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2440.191163] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 5683f242-4848-42fa-9353-46982c3a72c0] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2440.191282] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Didn't find any instances for network info cache update. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9995}} [ 2440.191761] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2440.191937] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2440.192106] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager.update_available_resource {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2440.202722] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2440.202941] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2440.203122] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2440.203276] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62277) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2440.204337] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-06ee7baa-9073-4299-9b1f-c35b9ed088e8 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2440.213053] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-293eb87c-298d-4c3c-9adb-9cff4cc31c4d {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2440.226786] env[62277]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8bcf20de-01a4-4ca8-9b8d-d614055843d0 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2440.233159] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-69331a50-e844-4ee6-b2e0-53e75b6de00d {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2440.262810] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181369MB free_disk=184GB free_vcpus=48 pci_devices=None {{(pid=62277) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2440.262975] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2440.263180] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2440.327687] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance ad83bf06-d712-4bd4-8086-9c3b615adaf5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2440.327947] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance de543a46-26c3-40b3-9ccd-80bb1f9845d7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2440.327985] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 297d53df-7918-4389-9c63-a600755da969 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2440.328093] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 940561d5-723b-4e43-8fab-35e8af95ce09 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2440.328216] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 6737c3b9-d9e6-4879-a6df-46d3c7dee40e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2440.328332] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 4f26ed27-558d-489a-9141-ec63b6164cc8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2440.328449] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 26a7549d-94b4-4113-ab8b-10886eafcd49 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2440.328563] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 295cb2fd-b409-4d5c-8fef-12b7acd9fec0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2440.328676] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 5683f242-4848-42fa-9353-46982c3a72c0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2440.328854] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Total usable vcpus: 48, total allocated vcpus: 9 {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2440.328989] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1664MB phys_disk=200GB used_disk=9GB total_vcpus=48 used_vcpus=9 pci_stats=[] {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2440.426383] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-522fa1d1-f7c6-4843-ac04-ee5aadfa16c5 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2440.433762] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f956e28d-d145-44e7-b876-268c9dd1db5f {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2440.462365] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-70271d2c-25df-463b-96fb-2c64d73fcc92 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2440.468884] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-264fa1b5-c256-4e95-bd6d-a2895aa1f45e {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2440.481967] env[62277]: DEBUG nova.compute.provider_tree 
[None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2440.489565] env[62277]: DEBUG nova.scheduler.client.report [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2440.502066] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62277) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2440.502237] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.239s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2440.538763] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2440.538976] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 5683f242-4848-42fa-9353-46982c3a72c0] Processing image 6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2440.539203] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2441.225579] env[62277]: DEBUG nova.compute.manager [req-4619f2e4-bd1b-4af1-b7e0-3fcd7fc5c4d7 req-6fc13d3a-fbb4-400c-b9eb-46d438276710 service nova] [instance: 5683f242-4848-42fa-9353-46982c3a72c0] Received event network-changed-c1c2ce02-2543-4290-bc95-8110dce729c5 {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 2441.225682] env[62277]: DEBUG nova.compute.manager [req-4619f2e4-bd1b-4af1-b7e0-3fcd7fc5c4d7 req-6fc13d3a-fbb4-400c-b9eb-46d438276710 service nova] [instance: 5683f242-4848-42fa-9353-46982c3a72c0] Refreshing instance network info cache due to event network-changed-c1c2ce02-2543-4290-bc95-8110dce729c5. 
{{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11104}} [ 2441.226276] env[62277]: DEBUG oslo_concurrency.lockutils [req-4619f2e4-bd1b-4af1-b7e0-3fcd7fc5c4d7 req-6fc13d3a-fbb4-400c-b9eb-46d438276710 service nova] Acquiring lock "refresh_cache-5683f242-4848-42fa-9353-46982c3a72c0" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2441.226432] env[62277]: DEBUG oslo_concurrency.lockutils [req-4619f2e4-bd1b-4af1-b7e0-3fcd7fc5c4d7 req-6fc13d3a-fbb4-400c-b9eb-46d438276710 service nova] Acquired lock "refresh_cache-5683f242-4848-42fa-9353-46982c3a72c0" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2441.226592] env[62277]: DEBUG nova.network.neutron [req-4619f2e4-bd1b-4af1-b7e0-3fcd7fc5c4d7 req-6fc13d3a-fbb4-400c-b9eb-46d438276710 service nova] [instance: 5683f242-4848-42fa-9353-46982c3a72c0] Refreshing network info cache for port c1c2ce02-2543-4290-bc95-8110dce729c5 {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 2441.498099] env[62277]: DEBUG nova.network.neutron [req-4619f2e4-bd1b-4af1-b7e0-3fcd7fc5c4d7 req-6fc13d3a-fbb4-400c-b9eb-46d438276710 service nova] [instance: 5683f242-4848-42fa-9353-46982c3a72c0] Updated VIF entry in instance network info cache for port c1c2ce02-2543-4290-bc95-8110dce729c5. {{(pid=62277) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 2441.498447] env[62277]: DEBUG nova.network.neutron [req-4619f2e4-bd1b-4af1-b7e0-3fcd7fc5c4d7 req-6fc13d3a-fbb4-400c-b9eb-46d438276710 service nova] [instance: 5683f242-4848-42fa-9353-46982c3a72c0] Updating instance_info_cache with network_info: [{"id": "c1c2ce02-2543-4290-bc95-8110dce729c5", "address": "fa:16:3e:3b:98:c7", "network": {"id": "73ff19f8-2a7e-46b5-ad84-44eb5672dd85", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1025699415-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0c951cee39d94e49af963590cccf95fb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "09bf081b-cdf0-4977-abe2-2339a87409ab", "external-id": "nsx-vlan-transportzone-378", "segmentation_id": 378, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc1c2ce02-25", "ovs_interfaceid": "c1c2ce02-2543-4290-bc95-8110dce729c5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2441.507834] env[62277]: DEBUG oslo_concurrency.lockutils [req-4619f2e4-bd1b-4af1-b7e0-3fcd7fc5c4d7 req-6fc13d3a-fbb4-400c-b9eb-46d438276710 service nova] Releasing lock "refresh_cache-5683f242-4848-42fa-9353-46982c3a72c0" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2442.478930] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62277) run_periodic_tasks 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2442.479131] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=62277) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10528}} [ 2443.168571] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2448.091703] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2448.092249] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Getting list of instances from cluster (obj){ [ 2448.092249] env[62277]: value = "domain-c8" [ 2448.092249] env[62277]: _type = "ClusterComputeResource" [ 2448.092249] env[62277]: } {{(pid=62277) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 2448.093305] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f52b9a90-ab11-4b4d-8ba0-293c891761c4 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2448.109141] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Got total of 9 instances {{(pid=62277) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 2448.197444] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2450.168597] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2458.008891] env[62277]: WARNING oslo_vmware.rw_handles [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2458.008891] env[62277]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2458.008891] env[62277]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2458.008891] env[62277]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2458.008891] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2458.008891] env[62277]: ERROR oslo_vmware.rw_handles response.begin() [ 2458.008891] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2458.008891] env[62277]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2458.008891] env[62277]: ERROR 
oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2458.008891] env[62277]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2458.008891] env[62277]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2458.008891] env[62277]: ERROR oslo_vmware.rw_handles [ 2458.009539] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Downloaded image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to vmware_temp/a7a5e793-cae5-4780-9e83-c9a1713c0eb6/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2458.011566] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Caching image {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2458.011802] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Copying Virtual Disk [datastore2] vmware_temp/a7a5e793-cae5-4780-9e83-c9a1713c0eb6/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk to [datastore2] vmware_temp/a7a5e793-cae5-4780-9e83-c9a1713c0eb6/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk {{(pid=62277) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2458.012129] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-0b0269fa-bb74-4fe7-b4a9-33b2ddbc2947 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2458.020145] env[62277]: DEBUG oslo_vmware.api [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Waiting for the task: (returnval){ [ 2458.020145] env[62277]: value = "task-1405519" [ 2458.020145] env[62277]: _type = "Task" [ 2458.020145] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2458.028083] env[62277]: DEBUG oslo_vmware.api [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Task: {'id': task-1405519, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2458.535618] env[62277]: DEBUG oslo_vmware.exceptions [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Fault InvalidArgument not matched. 
{{(pid=62277) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2458.536066] env[62277]: DEBUG oslo_concurrency.lockutils [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2458.536892] env[62277]: ERROR nova.compute.manager [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2458.536892] env[62277]: Faults: ['InvalidArgument'] [ 2458.536892] env[62277]: ERROR nova.compute.manager [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Traceback (most recent call last): [ 2458.536892] env[62277]: ERROR nova.compute.manager [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2458.536892] env[62277]: ERROR nova.compute.manager [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] yield resources [ 2458.536892] env[62277]: ERROR nova.compute.manager [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2458.536892] env[62277]: ERROR nova.compute.manager [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] self.driver.spawn(context, instance, image_meta, [ 2458.536892] env[62277]: ERROR nova.compute.manager [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2458.536892] env[62277]: ERROR nova.compute.manager [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2458.536892] env[62277]: ERROR nova.compute.manager [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2458.536892] env[62277]: ERROR nova.compute.manager [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] self._fetch_image_if_missing(context, vi) [ 2458.536892] env[62277]: ERROR nova.compute.manager [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2458.536892] env[62277]: ERROR nova.compute.manager [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] image_cache(vi, tmp_image_ds_loc) [ 2458.536892] env[62277]: ERROR nova.compute.manager [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2458.536892] env[62277]: ERROR nova.compute.manager [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] vm_util.copy_virtual_disk( [ 2458.536892] env[62277]: ERROR nova.compute.manager [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2458.536892] env[62277]: ERROR nova.compute.manager [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] session._wait_for_task(vmdk_copy_task) [ 2458.536892] env[62277]: ERROR nova.compute.manager [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2458.536892] env[62277]: ERROR nova.compute.manager [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] return self.wait_for_task(task_ref) [ 2458.536892] env[62277]: ERROR nova.compute.manager [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2458.536892] env[62277]: ERROR nova.compute.manager [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] return evt.wait() [ 2458.536892] env[62277]: ERROR nova.compute.manager [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2458.536892] env[62277]: ERROR nova.compute.manager [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] result = hub.switch() [ 2458.536892] env[62277]: ERROR nova.compute.manager [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2458.536892] env[62277]: ERROR nova.compute.manager [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] return self.greenlet.switch() [ 2458.536892] env[62277]: ERROR nova.compute.manager [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2458.536892] env[62277]: ERROR nova.compute.manager [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] self.f(*self.args, **self.kw) [ 2458.536892] env[62277]: ERROR nova.compute.manager [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2458.536892] env[62277]: ERROR nova.compute.manager [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] raise exceptions.translate_fault(task_info.error) [ 2458.536892] env[62277]: ERROR nova.compute.manager [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2458.536892] env[62277]: ERROR nova.compute.manager [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Faults: ['InvalidArgument'] [ 2458.536892] env[62277]: ERROR nova.compute.manager [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] [ 2458.537897] env[62277]: INFO nova.compute.manager [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Terminating instance [ 2458.539533] env[62277]: DEBUG oslo_concurrency.lockutils [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2458.539844] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2458.540217] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with 
opID=oslo.vmware-9923c90f-1d76-4771-95ca-e6f1afa11905 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2458.543644] env[62277]: DEBUG nova.compute.manager [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Start destroying the instance on the hypervisor. {{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2458.543940] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2458.545128] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7431f08f-680f-4b8a-a2d4-076757c79009 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2458.555053] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Unregistering the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2458.556644] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-0e54631a-e315-442a-bd23-7d4706a41008 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2458.558902] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2458.559190] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62277) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2458.560254] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e14068f7-ab1d-497d-adac-6743dec0dc82 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2458.567229] env[62277]: DEBUG oslo_vmware.api [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Waiting for the task: (returnval){ [ 2458.567229] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52d16a4a-0495-14a2-6739-50537b5edb71" [ 2458.567229] env[62277]: _type = "Task" [ 2458.567229] env[62277]: } to complete. 
{{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2458.578359] env[62277]: DEBUG oslo_vmware.api [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52d16a4a-0495-14a2-6739-50537b5edb71, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2458.637213] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Unregistered the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2458.637435] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Deleting contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2458.637602] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Deleting the datastore file [datastore2] ad83bf06-d712-4bd4-8086-9c3b615adaf5 {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2458.637871] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-c0776740-a2f0-4ddd-91b8-afe68878b3c1 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2458.646996] env[62277]: DEBUG oslo_vmware.api [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Waiting for the task: (returnval){ [ 2458.646996] env[62277]: value = "task-1405521" [ 2458.646996] env[62277]: _type = "Task" [ 2458.646996] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2458.655025] env[62277]: DEBUG oslo_vmware.api [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Task: {'id': task-1405521, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2459.078396] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Preparing fetch location {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2459.078820] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Creating directory with path [datastore2] vmware_temp/950eb181-6cf3-4c12-b1ce-6b418cef867b/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2459.078914] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a90eb1a0-211c-4ba1-a7a2-e86cc878e747 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2459.090962] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Created directory with path [datastore2] vmware_temp/950eb181-6cf3-4c12-b1ce-6b418cef867b/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2459.091173] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Fetch image to [datastore2] vmware_temp/950eb181-6cf3-4c12-b1ce-6b418cef867b/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2459.091341] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to [datastore2] vmware_temp/950eb181-6cf3-4c12-b1ce-6b418cef867b/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2459.092131] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-18ce3c82-c2a8-4f88-af03-ff14046afd99 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2459.098923] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-43623b5f-1b04-4909-b656-e9c3e9bccd4b {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2459.108274] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-957365bf-89fc-40ab-ac11-9e1898d0c9e5 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2459.139414] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-16b5a9e5-c9c2-4982-85d8-e72b23d0b339 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2459.145635] env[62277]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-4dc12ff2-9bc1-4fdf-b314-4267de106881 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2459.155462] env[62277]: DEBUG oslo_vmware.api [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Task: {'id': task-1405521, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.075218} completed successfully. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2459.155696] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Deleted the datastore file {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2459.155877] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Deleted contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2459.156062] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2459.156276] env[62277]: INFO nova.compute.manager [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Took 0.61 seconds to destroy the instance on the hypervisor. 
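The InvalidArgument ('fileType') fault logged at 2458.536892 above comes out of the CopyVirtualDisk_Task polling path. A minimal sketch of that call pattern, assuming an already-connected oslo_vmware VMwareAPISession passed in as `session`; the function name and its arguments below are illustrative, not nova's actual helper:

    from oslo_vmware import exceptions

    def copy_vmdk(session, dc_ref, src_path, dst_path):
        # CopyVirtualDisk_Task returns a Task moref; wait_for_task() polls it
        # and re-raises any VIM fault on the task as VimFaultException.
        disk_mgr = session.vim.service_content.virtualDiskManager
        task = session.invoke_api(
            session.vim, 'CopyVirtualDisk_Task', disk_mgr,
            sourceName=src_path, sourceDatacenter=dc_ref,
            destName=dst_path, destDatacenter=dc_ref)
        try:
            session.wait_for_task(task)
        except exceptions.VimFaultException as exc:
            # Faults such as ['InvalidArgument'] from the log land here.
            print('copy failed:', exc.fault_list, exc)
            raise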
[ 2459.158388] env[62277]: DEBUG nova.compute.claims [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Aborting claim: {{(pid=62277) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 2459.158551] env[62277]: DEBUG oslo_concurrency.lockutils [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2459.158771] env[62277]: DEBUG oslo_concurrency.lockutils [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2459.169465] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2459.227642] env[62277]: DEBUG oslo_vmware.rw_handles [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/950eb181-6cf3-4c12-b1ce-6b418cef867b/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62277) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2459.288354] env[62277]: DEBUG oslo_vmware.rw_handles [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Completed reading data from the image iterator. {{(pid=62277) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2459.288510] env[62277]: DEBUG oslo_vmware.rw_handles [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/950eb181-6cf3-4c12-b1ce-6b418cef867b/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=62277) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2459.390560] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b68dc43-cbee-4766-b8af-d01ac31b01f0 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2459.398209] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-84a29235-0353-4c5e-b843-b5a813fa6a2a {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2459.427443] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d40df93b-722b-4801-99e8-eb37756fea70 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2459.434951] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dcb1be4f-d646-46f6-b209-879dd0f0761c {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2459.448295] env[62277]: DEBUG nova.compute.provider_tree [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2459.457295] env[62277]: DEBUG nova.scheduler.client.report [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2459.470678] env[62277]: DEBUG oslo_concurrency.lockutils [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.312s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2459.471227] env[62277]: ERROR nova.compute.manager [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2459.471227] env[62277]: Faults: ['InvalidArgument'] [ 2459.471227] env[62277]: ERROR nova.compute.manager [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Traceback (most recent call last): [ 2459.471227] env[62277]: ERROR nova.compute.manager [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in 
_build_and_run_instance [ 2459.471227] env[62277]: ERROR nova.compute.manager [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] self.driver.spawn(context, instance, image_meta, [ 2459.471227] env[62277]: ERROR nova.compute.manager [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2459.471227] env[62277]: ERROR nova.compute.manager [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2459.471227] env[62277]: ERROR nova.compute.manager [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2459.471227] env[62277]: ERROR nova.compute.manager [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] self._fetch_image_if_missing(context, vi) [ 2459.471227] env[62277]: ERROR nova.compute.manager [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2459.471227] env[62277]: ERROR nova.compute.manager [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] image_cache(vi, tmp_image_ds_loc) [ 2459.471227] env[62277]: ERROR nova.compute.manager [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2459.471227] env[62277]: ERROR nova.compute.manager [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] vm_util.copy_virtual_disk( [ 2459.471227] env[62277]: ERROR nova.compute.manager [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2459.471227] env[62277]: ERROR nova.compute.manager [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] session._wait_for_task(vmdk_copy_task) [ 2459.471227] env[62277]: ERROR nova.compute.manager [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2459.471227] env[62277]: ERROR nova.compute.manager [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] return self.wait_for_task(task_ref) [ 2459.471227] env[62277]: ERROR nova.compute.manager [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2459.471227] env[62277]: ERROR nova.compute.manager [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] return evt.wait() [ 2459.471227] env[62277]: ERROR nova.compute.manager [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2459.471227] env[62277]: ERROR nova.compute.manager [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] result = hub.switch() [ 2459.471227] env[62277]: ERROR nova.compute.manager [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2459.471227] env[62277]: ERROR nova.compute.manager [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] return self.greenlet.switch() [ 2459.471227] env[62277]: ERROR nova.compute.manager [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2459.471227] env[62277]: ERROR nova.compute.manager [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] self.f(*self.args, **self.kw) [ 2459.471227] env[62277]: ERROR nova.compute.manager [instance: 
ad83bf06-d712-4bd4-8086-9c3b615adaf5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2459.471227] env[62277]: ERROR nova.compute.manager [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] raise exceptions.translate_fault(task_info.error) [ 2459.471227] env[62277]: ERROR nova.compute.manager [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2459.471227] env[62277]: ERROR nova.compute.manager [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Faults: ['InvalidArgument'] [ 2459.471227] env[62277]: ERROR nova.compute.manager [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] [ 2459.471914] env[62277]: DEBUG nova.compute.utils [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] VimFaultException {{(pid=62277) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2459.473333] env[62277]: DEBUG nova.compute.manager [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Build of instance ad83bf06-d712-4bd4-8086-9c3b615adaf5 was re-scheduled: A specified parameter was not correct: fileType [ 2459.473333] env[62277]: Faults: ['InvalidArgument'] {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2459.473689] env[62277]: DEBUG nova.compute.manager [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Unplugging VIFs for instance {{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2459.473857] env[62277]: DEBUG nova.compute.manager [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2459.474036] env[62277]: DEBUG nova.compute.manager [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2459.474233] env[62277]: DEBUG nova.network.neutron [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2459.785206] env[62277]: DEBUG nova.network.neutron [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2459.798377] env[62277]: INFO nova.compute.manager [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Took 0.32 seconds to deallocate network for instance. [ 2459.891594] env[62277]: INFO nova.scheduler.client.report [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Deleted allocations for instance ad83bf06-d712-4bd4-8086-9c3b615adaf5 [ 2459.912698] env[62277]: DEBUG oslo_concurrency.lockutils [None req-71b000e3-ec1f-4b31-a0c1-8396746d6082 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Lock "ad83bf06-d712-4bd4-8086-9c3b615adaf5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 618.887s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2459.913048] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7028cb85-dee9-4ac8-aa89-05c1fceb35f7 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Lock "ad83bf06-d712-4bd4-8086-9c3b615adaf5" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 423.645s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2459.913228] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7028cb85-dee9-4ac8-aa89-05c1fceb35f7 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Acquiring lock "ad83bf06-d712-4bd4-8086-9c3b615adaf5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2459.913452] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7028cb85-dee9-4ac8-aa89-05c1fceb35f7 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Lock "ad83bf06-d712-4bd4-8086-9c3b615adaf5-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2459.913617] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7028cb85-dee9-4ac8-aa89-05c1fceb35f7 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Lock "ad83bf06-d712-4bd4-8086-9c3b615adaf5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2459.915731] env[62277]: INFO nova.compute.manager [None req-7028cb85-dee9-4ac8-aa89-05c1fceb35f7 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Terminating instance [ 2459.917392] env[62277]: DEBUG nova.compute.manager [None req-7028cb85-dee9-4ac8-aa89-05c1fceb35f7 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Start destroying the instance on the hypervisor. {{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2459.917593] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-7028cb85-dee9-4ac8-aa89-05c1fceb35f7 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2459.918071] env[62277]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-a002d9ba-5838-4c63-8d0a-e1a432cc9ff2 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2459.927820] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-84b683cd-0133-426e-b355-2077fd915642 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2459.955873] env[62277]: WARNING nova.virt.vmwareapi.vmops [None req-7028cb85-dee9-4ac8-aa89-05c1fceb35f7 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance ad83bf06-d712-4bd4-8086-9c3b615adaf5 could not be found. [ 2459.956112] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-7028cb85-dee9-4ac8-aa89-05c1fceb35f7 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2459.956311] env[62277]: INFO nova.compute.manager [None req-7028cb85-dee9-4ac8-aa89-05c1fceb35f7 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 2459.956560] env[62277]: DEBUG oslo.service.loopingcall [None req-7028cb85-dee9-4ac8-aa89-05c1fceb35f7 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2459.957046] env[62277]: DEBUG nova.compute.manager [-] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2459.957152] env[62277]: DEBUG nova.network.neutron [-] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2459.984867] env[62277]: DEBUG nova.network.neutron [-] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2459.992905] env[62277]: INFO nova.compute.manager [-] [instance: ad83bf06-d712-4bd4-8086-9c3b615adaf5] Took 0.04 seconds to deallocate network for instance. [ 2460.076769] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7028cb85-dee9-4ac8-aa89-05c1fceb35f7 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Lock "ad83bf06-d712-4bd4-8086-9c3b615adaf5" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.164s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2482.079632] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._sync_power_states {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2482.098708] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Getting list of instances from cluster (obj){ [ 2482.098708] env[62277]: value = "domain-c8" [ 2482.098708] env[62277]: _type = "ClusterComputeResource" [ 2482.098708] env[62277]: } {{(pid=62277) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 2482.100470] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b12ba56a-b50c-4c3a-a70f-30b7c055872e {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2482.115421] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Got total of 8 instances {{(pid=62277) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 2482.115597] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Triggering sync for uuid de543a46-26c3-40b3-9ccd-80bb1f9845d7 {{(pid=62277) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10319}} [ 2482.115799] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Triggering sync for uuid 297d53df-7918-4389-9c63-a600755da969 {{(pid=62277) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10319}} [ 2482.115960] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None 
None] Triggering sync for uuid 940561d5-723b-4e43-8fab-35e8af95ce09 {{(pid=62277) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10319}} [ 2482.116130] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Triggering sync for uuid 6737c3b9-d9e6-4879-a6df-46d3c7dee40e {{(pid=62277) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10319}} [ 2482.116309] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Triggering sync for uuid 4f26ed27-558d-489a-9141-ec63b6164cc8 {{(pid=62277) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10319}} [ 2482.116471] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Triggering sync for uuid 26a7549d-94b4-4113-ab8b-10886eafcd49 {{(pid=62277) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10319}} [ 2482.116619] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Triggering sync for uuid 295cb2fd-b409-4d5c-8fef-12b7acd9fec0 {{(pid=62277) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10319}} [ 2482.116767] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Triggering sync for uuid 5683f242-4848-42fa-9353-46982c3a72c0 {{(pid=62277) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10319}} [ 2482.117095] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "de543a46-26c3-40b3-9ccd-80bb1f9845d7" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2482.117339] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "297d53df-7918-4389-9c63-a600755da969" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2482.117543] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "940561d5-723b-4e43-8fab-35e8af95ce09" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2482.117732] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "6737c3b9-d9e6-4879-a6df-46d3c7dee40e" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2482.117930] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "4f26ed27-558d-489a-9141-ec63b6164cc8" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2482.118142] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "26a7549d-94b4-4113-ab8b-10886eafcd49" by 
"nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2482.118386] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "295cb2fd-b409-4d5c-8fef-12b7acd9fec0" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2482.118508] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "5683f242-4848-42fa-9353-46982c3a72c0" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2482.437130] env[62277]: DEBUG oslo_concurrency.lockutils [None req-339afca7-4cd5-46f6-943c-50429aa5af37 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Acquiring lock "295cb2fd-b409-4d5c-8fef-12b7acd9fec0" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2492.208249] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2496.169072] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2497.164893] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2500.169498] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager.update_available_resource {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2500.181100] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2500.181316] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2500.181481] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s 
{{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2500.181631] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62277) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2500.182698] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e2c88df7-5785-4a0d-a5a3-5207d139b904 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2500.191289] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-95a2ef71-9320-400e-a805-dcfbeff827b1 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2500.204569] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a4b3ac43-fc73-4764-b2df-e90fe39f4a8e {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2500.210396] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4279eb67-1576-4c28-9f8a-e3c97ba2b7fe {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2500.239537] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181399MB free_disk=184GB free_vcpus=48 pci_devices=None {{(pid=62277) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2500.239687] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2500.239873] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2500.310705] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance de543a46-26c3-40b3-9ccd-80bb1f9845d7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2500.310858] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 297d53df-7918-4389-9c63-a600755da969 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2500.310984] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 940561d5-723b-4e43-8fab-35e8af95ce09 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2500.311127] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 6737c3b9-d9e6-4879-a6df-46d3c7dee40e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2500.311250] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 4f26ed27-558d-489a-9141-ec63b6164cc8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2500.311368] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 26a7549d-94b4-4113-ab8b-10886eafcd49 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2500.311481] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 295cb2fd-b409-4d5c-8fef-12b7acd9fec0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2500.311597] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 5683f242-4848-42fa-9353-46982c3a72c0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2500.311776] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Total usable vcpus: 48, total allocated vcpus: 8 {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2500.311909] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1536MB phys_disk=200GB used_disk=8GB total_vcpus=48 used_vcpus=8 pci_stats=[] {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2500.402622] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b2e1e3c-0764-4751-892a-e40d49d78188 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2500.410113] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aa927146-7b01-49df-9009-55768921ebfa {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2500.439782] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-151c24d8-289f-4516-8d7e-b7040ece51ff {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2500.446414] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48428c04-60f6-445f-b117-8b1fa355fa1f {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2500.458987] env[62277]: DEBUG nova.compute.provider_tree [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2500.467051] env[62277]: DEBUG nova.scheduler.client.report [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2500.479755] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62277) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2500.479925] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.240s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2501.478829] env[62277]: DEBUG oslo_service.periodic_task [None 
req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2502.168808] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2502.168969] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Starting heal instance info cache {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9909}} [ 2502.169077] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Rebuilding the list of instances to heal {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9913}} [ 2502.186634] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2502.186836] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 297d53df-7918-4389-9c63-a600755da969] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2502.186972] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2502.187110] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2502.187232] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2502.187353] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2502.187473] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2502.187591] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 5683f242-4848-42fa-9353-46982c3a72c0] Skipping network cache update for instance because it is Building. 
{{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2502.187709] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Didn't find any instances for network info cache update. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9995}} [ 2502.188180] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2503.168452] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2503.168777] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=62277) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10528}} [ 2505.169971] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2508.027107] env[62277]: WARNING oslo_vmware.rw_handles [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2508.027107] env[62277]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2508.027107] env[62277]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2508.027107] env[62277]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2508.027107] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2508.027107] env[62277]: ERROR oslo_vmware.rw_handles response.begin() [ 2508.027107] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2508.027107] env[62277]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2508.027107] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2508.027107] env[62277]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2508.027107] env[62277]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2508.027107] env[62277]: ERROR oslo_vmware.rw_handles [ 2508.027717] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Downloaded image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to vmware_temp/950eb181-6cf3-4c12-b1ce-6b418cef867b/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2508.029658] 
env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Caching image {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2508.029917] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Copying Virtual Disk [datastore2] vmware_temp/950eb181-6cf3-4c12-b1ce-6b418cef867b/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk to [datastore2] vmware_temp/950eb181-6cf3-4c12-b1ce-6b418cef867b/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk {{(pid=62277) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2508.030232] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-826a2f6f-2eba-4922-8d3f-42ef9bfba4cc {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2508.039885] env[62277]: DEBUG oslo_vmware.api [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Waiting for the task: (returnval){ [ 2508.039885] env[62277]: value = "task-1405522" [ 2508.039885] env[62277]: _type = "Task" [ 2508.039885] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2508.048102] env[62277]: DEBUG oslo_vmware.api [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Task: {'id': task-1405522, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2508.550503] env[62277]: DEBUG oslo_vmware.exceptions [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Fault InvalidArgument not matched. 
{{(pid=62277) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2508.550913] env[62277]: DEBUG oslo_concurrency.lockutils [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2508.551359] env[62277]: ERROR nova.compute.manager [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2508.551359] env[62277]: Faults: ['InvalidArgument'] [ 2508.551359] env[62277]: ERROR nova.compute.manager [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Traceback (most recent call last): [ 2508.551359] env[62277]: ERROR nova.compute.manager [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2508.551359] env[62277]: ERROR nova.compute.manager [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] yield resources [ 2508.551359] env[62277]: ERROR nova.compute.manager [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2508.551359] env[62277]: ERROR nova.compute.manager [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] self.driver.spawn(context, instance, image_meta, [ 2508.551359] env[62277]: ERROR nova.compute.manager [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2508.551359] env[62277]: ERROR nova.compute.manager [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2508.551359] env[62277]: ERROR nova.compute.manager [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2508.551359] env[62277]: ERROR nova.compute.manager [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] self._fetch_image_if_missing(context, vi) [ 2508.551359] env[62277]: ERROR nova.compute.manager [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2508.551359] env[62277]: ERROR nova.compute.manager [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] image_cache(vi, tmp_image_ds_loc) [ 2508.551359] env[62277]: ERROR nova.compute.manager [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2508.551359] env[62277]: ERROR nova.compute.manager [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] vm_util.copy_virtual_disk( [ 2508.551359] env[62277]: ERROR nova.compute.manager [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2508.551359] env[62277]: ERROR nova.compute.manager [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] session._wait_for_task(vmdk_copy_task) [ 2508.551359] env[62277]: ERROR nova.compute.manager [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2508.551359] env[62277]: ERROR nova.compute.manager [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] return self.wait_for_task(task_ref) [ 2508.551359] env[62277]: ERROR nova.compute.manager [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2508.551359] env[62277]: ERROR nova.compute.manager [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] return evt.wait() [ 2508.551359] env[62277]: ERROR nova.compute.manager [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2508.551359] env[62277]: ERROR nova.compute.manager [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] result = hub.switch() [ 2508.551359] env[62277]: ERROR nova.compute.manager [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2508.551359] env[62277]: ERROR nova.compute.manager [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] return self.greenlet.switch() [ 2508.551359] env[62277]: ERROR nova.compute.manager [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2508.551359] env[62277]: ERROR nova.compute.manager [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] self.f(*self.args, **self.kw) [ 2508.551359] env[62277]: ERROR nova.compute.manager [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2508.551359] env[62277]: ERROR nova.compute.manager [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] raise exceptions.translate_fault(task_info.error) [ 2508.551359] env[62277]: ERROR nova.compute.manager [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2508.551359] env[62277]: ERROR nova.compute.manager [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Faults: ['InvalidArgument'] [ 2508.551359] env[62277]: ERROR nova.compute.manager [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] [ 2508.552282] env[62277]: INFO nova.compute.manager [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Terminating instance [ 2508.553232] env[62277]: DEBUG oslo_concurrency.lockutils [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2508.553439] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2508.553678] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with 
opID=oslo.vmware-685b3311-6d71-4fcb-a903-df9b2c0fcada {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2508.556073] env[62277]: DEBUG nova.compute.manager [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Start destroying the instance on the hypervisor. {{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2508.556265] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2508.557042] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a1242f87-d98d-491e-ace0-5ecb4dd0c15f {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2508.564047] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Unregistering the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2508.565068] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-673bfce7-d678-4aba-b90b-7ca6ad89f2f0 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2508.566464] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2508.566652] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62277) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2508.567351] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-cde97c58-a95b-41bd-9a1a-57a54548ea9e {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2508.572636] env[62277]: DEBUG oslo_vmware.api [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Waiting for the task: (returnval){ [ 2508.572636] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52144aa6-e6b9-1d4c-fe00-746d68f03565" [ 2508.572636] env[62277]: _type = "Task" [ 2508.572636] env[62277]: } to complete. 
{{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2508.587082] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 297d53df-7918-4389-9c63-a600755da969] Preparing fetch location {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2508.587346] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Creating directory with path [datastore2] vmware_temp/653d931b-0355-40a4-a6f5-44ba8a5fea57/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2508.587582] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6dc00a22-138b-4c0a-8135-3d46a6393387 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2508.609099] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Created directory with path [datastore2] vmware_temp/653d931b-0355-40a4-a6f5-44ba8a5fea57/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2508.609294] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 297d53df-7918-4389-9c63-a600755da969] Fetch image to [datastore2] vmware_temp/653d931b-0355-40a4-a6f5-44ba8a5fea57/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2508.609470] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 297d53df-7918-4389-9c63-a600755da969] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to [datastore2] vmware_temp/653d931b-0355-40a4-a6f5-44ba8a5fea57/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2508.610293] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-25ca7a7f-d802-44d0-b2d8-06dcf81e6d92 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2508.618725] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b67bf2cf-56ee-4d33-99dc-5248105fd7f8 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2508.627748] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4d2e6181-1176-48e9-9aa8-98c7f28090f1 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2508.632259] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 
tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Unregistered the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2508.632456] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Deleting contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2508.632638] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Deleting the datastore file [datastore2] de543a46-26c3-40b3-9ccd-80bb1f9845d7 {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2508.633214] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-72c5f720-78b0-44a7-9acb-f7b1d431485a {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2508.660812] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d15bf5b2-92b5-4cb7-9476-25ade36c24cd {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2508.663283] env[62277]: DEBUG oslo_vmware.api [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Waiting for the task: (returnval){ [ 2508.663283] env[62277]: value = "task-1405524" [ 2508.663283] env[62277]: _type = "Task" [ 2508.663283] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2508.668203] env[62277]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-aa8a7f43-5118-43ac-beb2-4f3a3e38899b {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2508.672240] env[62277]: DEBUG oslo_vmware.api [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Task: {'id': task-1405524, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2508.688902] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 297d53df-7918-4389-9c63-a600755da969] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2508.837434] env[62277]: DEBUG oslo_vmware.rw_handles [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/653d931b-0355-40a4-a6f5-44ba8a5fea57/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62277) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2508.896751] env[62277]: DEBUG oslo_vmware.rw_handles [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Completed reading data from the image iterator. {{(pid=62277) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2508.896949] env[62277]: DEBUG oslo_vmware.rw_handles [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/653d931b-0355-40a4-a6f5-44ba8a5fea57/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62277) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2509.173661] env[62277]: DEBUG oslo_vmware.api [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Task: {'id': task-1405524, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.116999} completed successfully. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2509.173976] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Deleted the datastore file {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2509.174097] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Deleted contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2509.174268] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2509.174437] env[62277]: INFO nova.compute.manager [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Took 0.62 seconds to destroy the instance on the hypervisor. [ 2509.176563] env[62277]: DEBUG nova.compute.claims [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Aborting claim: {{(pid=62277) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 2509.176735] env[62277]: DEBUG oslo_concurrency.lockutils [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2509.176941] env[62277]: DEBUG oslo_concurrency.lockutils [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2509.310591] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e313efc5-7bff-4e71-b0bf-45240c5f7bb0 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2509.318369] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c081c003-3e96-410e-a725-54764e7695c9 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2509.349438] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33bf6fd0-fb35-4d1e-8178-ee551a756c3d {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2509.356534] env[62277]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e427fa52-69d3-43c2-9cc8-bc6114a39311 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2509.369575] env[62277]: DEBUG nova.compute.provider_tree [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2509.378080] env[62277]: DEBUG nova.scheduler.client.report [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2509.391289] env[62277]: DEBUG oslo_concurrency.lockutils [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.214s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2509.391913] env[62277]: ERROR nova.compute.manager [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2509.391913] env[62277]: Faults: ['InvalidArgument'] [ 2509.391913] env[62277]: ERROR nova.compute.manager [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Traceback (most recent call last): [ 2509.391913] env[62277]: ERROR nova.compute.manager [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2509.391913] env[62277]: ERROR nova.compute.manager [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] self.driver.spawn(context, instance, image_meta, [ 2509.391913] env[62277]: ERROR nova.compute.manager [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2509.391913] env[62277]: ERROR nova.compute.manager [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2509.391913] env[62277]: ERROR nova.compute.manager [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2509.391913] env[62277]: ERROR nova.compute.manager [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] self._fetch_image_if_missing(context, vi) [ 2509.391913] env[62277]: ERROR nova.compute.manager [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing 
[ 2509.391913] env[62277]: ERROR nova.compute.manager [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] image_cache(vi, tmp_image_ds_loc) [ 2509.391913] env[62277]: ERROR nova.compute.manager [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2509.391913] env[62277]: ERROR nova.compute.manager [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] vm_util.copy_virtual_disk( [ 2509.391913] env[62277]: ERROR nova.compute.manager [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2509.391913] env[62277]: ERROR nova.compute.manager [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] session._wait_for_task(vmdk_copy_task) [ 2509.391913] env[62277]: ERROR nova.compute.manager [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2509.391913] env[62277]: ERROR nova.compute.manager [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] return self.wait_for_task(task_ref) [ 2509.391913] env[62277]: ERROR nova.compute.manager [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2509.391913] env[62277]: ERROR nova.compute.manager [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] return evt.wait() [ 2509.391913] env[62277]: ERROR nova.compute.manager [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2509.391913] env[62277]: ERROR nova.compute.manager [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] result = hub.switch() [ 2509.391913] env[62277]: ERROR nova.compute.manager [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2509.391913] env[62277]: ERROR nova.compute.manager [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] return self.greenlet.switch() [ 2509.391913] env[62277]: ERROR nova.compute.manager [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2509.391913] env[62277]: ERROR nova.compute.manager [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] self.f(*self.args, **self.kw) [ 2509.391913] env[62277]: ERROR nova.compute.manager [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2509.391913] env[62277]: ERROR nova.compute.manager [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] raise exceptions.translate_fault(task_info.error) [ 2509.391913] env[62277]: ERROR nova.compute.manager [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2509.391913] env[62277]: ERROR nova.compute.manager [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Faults: ['InvalidArgument'] [ 2509.391913] env[62277]: ERROR nova.compute.manager [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] [ 2509.393245] env[62277]: DEBUG nova.compute.utils [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] VimFaultException {{(pid=62277) notify_about_instance_usage 
/opt/stack/nova/nova/compute/utils.py:430}} [ 2509.394087] env[62277]: DEBUG nova.compute.manager [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Build of instance de543a46-26c3-40b3-9ccd-80bb1f9845d7 was re-scheduled: A specified parameter was not correct: fileType [ 2509.394087] env[62277]: Faults: ['InvalidArgument'] {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2509.394457] env[62277]: DEBUG nova.compute.manager [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Unplugging VIFs for instance {{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2509.394633] env[62277]: DEBUG nova.compute.manager [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2509.394828] env[62277]: DEBUG nova.compute.manager [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2509.394993] env[62277]: DEBUG nova.network.neutron [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2509.669326] env[62277]: DEBUG nova.network.neutron [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2509.692889] env[62277]: INFO nova.compute.manager [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Took 0.30 seconds to deallocate network for instance. 
[ 2509.790036] env[62277]: INFO nova.scheduler.client.report [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Deleted allocations for instance de543a46-26c3-40b3-9ccd-80bb1f9845d7 [ 2509.809764] env[62277]: DEBUG oslo_concurrency.lockutils [None req-02e7a166-55c1-4c2d-b199-56e6ee5309f5 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Lock "de543a46-26c3-40b3-9ccd-80bb1f9845d7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 616.987s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2509.810464] env[62277]: DEBUG oslo_concurrency.lockutils [None req-02e3ae5e-70b8-4ba1-88c0-58a896429b9c tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Lock "de543a46-26c3-40b3-9ccd-80bb1f9845d7" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 420.598s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2509.810464] env[62277]: DEBUG oslo_concurrency.lockutils [None req-02e3ae5e-70b8-4ba1-88c0-58a896429b9c tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Acquiring lock "de543a46-26c3-40b3-9ccd-80bb1f9845d7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2509.810464] env[62277]: DEBUG oslo_concurrency.lockutils [None req-02e3ae5e-70b8-4ba1-88c0-58a896429b9c tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Lock "de543a46-26c3-40b3-9ccd-80bb1f9845d7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2509.810715] env[62277]: DEBUG oslo_concurrency.lockutils [None req-02e3ae5e-70b8-4ba1-88c0-58a896429b9c tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Lock "de543a46-26c3-40b3-9ccd-80bb1f9845d7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2509.812515] env[62277]: INFO nova.compute.manager [None req-02e3ae5e-70b8-4ba1-88c0-58a896429b9c tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Terminating instance [ 2509.814166] env[62277]: DEBUG nova.compute.manager [None req-02e3ae5e-70b8-4ba1-88c0-58a896429b9c tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Start destroying the instance on the hypervisor. 
{{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2509.814363] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-02e3ae5e-70b8-4ba1-88c0-58a896429b9c tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2509.814847] env[62277]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-32673496-906b-46f3-b235-c02bcd7e0401 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2509.824014] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-23800a3c-37df-4a8a-9648-65e557b36d4e {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2509.850550] env[62277]: WARNING nova.virt.vmwareapi.vmops [None req-02e3ae5e-70b8-4ba1-88c0-58a896429b9c tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance de543a46-26c3-40b3-9ccd-80bb1f9845d7 could not be found. [ 2509.850716] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-02e3ae5e-70b8-4ba1-88c0-58a896429b9c tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2509.850889] env[62277]: INFO nova.compute.manager [None req-02e3ae5e-70b8-4ba1-88c0-58a896429b9c tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2509.851141] env[62277]: DEBUG oslo.service.loopingcall [None req-02e3ae5e-70b8-4ba1-88c0-58a896429b9c tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2509.851582] env[62277]: DEBUG nova.compute.manager [-] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2509.851683] env[62277]: DEBUG nova.network.neutron [-] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2509.875929] env[62277]: DEBUG nova.network.neutron [-] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2509.883238] env[62277]: INFO nova.compute.manager [-] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] Took 0.03 seconds to deallocate network for instance. 
[ 2509.971819] env[62277]: DEBUG oslo_concurrency.lockutils [None req-02e3ae5e-70b8-4ba1-88c0-58a896429b9c tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Lock "de543a46-26c3-40b3-9ccd-80bb1f9845d7" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.162s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2509.972657] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "de543a46-26c3-40b3-9ccd-80bb1f9845d7" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 27.856s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2509.972865] env[62277]: INFO nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: de543a46-26c3-40b3-9ccd-80bb1f9845d7] During sync_power_state the instance has a pending task (deleting). Skip. [ 2509.973103] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "de543a46-26c3-40b3-9ccd-80bb1f9845d7" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2552.172162] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2557.165359] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2557.168014] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2558.667769] env[62277]: WARNING oslo_vmware.rw_handles [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2558.667769] env[62277]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2558.667769] env[62277]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2558.667769] env[62277]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2558.667769] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2558.667769] env[62277]: ERROR oslo_vmware.rw_handles response.begin() [ 2558.667769] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2558.667769] env[62277]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2558.667769] env[62277]: ERROR oslo_vmware.rw_handles File 
"/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2558.667769] env[62277]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2558.667769] env[62277]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2558.667769] env[62277]: ERROR oslo_vmware.rw_handles [ 2558.668478] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 297d53df-7918-4389-9c63-a600755da969] Downloaded image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to vmware_temp/653d931b-0355-40a4-a6f5-44ba8a5fea57/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2558.670283] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 297d53df-7918-4389-9c63-a600755da969] Caching image {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2558.670523] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Copying Virtual Disk [datastore2] vmware_temp/653d931b-0355-40a4-a6f5-44ba8a5fea57/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk to [datastore2] vmware_temp/653d931b-0355-40a4-a6f5-44ba8a5fea57/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk {{(pid=62277) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2558.670820] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-51acf76e-a949-41d1-920a-f235093f146b {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2558.679121] env[62277]: DEBUG oslo_vmware.api [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Waiting for the task: (returnval){ [ 2558.679121] env[62277]: value = "task-1405525" [ 2558.679121] env[62277]: _type = "Task" [ 2558.679121] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2558.687087] env[62277]: DEBUG oslo_vmware.api [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Task: {'id': task-1405525, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2559.189897] env[62277]: DEBUG oslo_vmware.exceptions [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Fault InvalidArgument not matched. 
{{(pid=62277) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2559.190191] env[62277]: DEBUG oslo_concurrency.lockutils [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2559.190746] env[62277]: ERROR nova.compute.manager [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 297d53df-7918-4389-9c63-a600755da969] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2559.190746] env[62277]: Faults: ['InvalidArgument'] [ 2559.190746] env[62277]: ERROR nova.compute.manager [instance: 297d53df-7918-4389-9c63-a600755da969] Traceback (most recent call last): [ 2559.190746] env[62277]: ERROR nova.compute.manager [instance: 297d53df-7918-4389-9c63-a600755da969] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2559.190746] env[62277]: ERROR nova.compute.manager [instance: 297d53df-7918-4389-9c63-a600755da969] yield resources [ 2559.190746] env[62277]: ERROR nova.compute.manager [instance: 297d53df-7918-4389-9c63-a600755da969] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2559.190746] env[62277]: ERROR nova.compute.manager [instance: 297d53df-7918-4389-9c63-a600755da969] self.driver.spawn(context, instance, image_meta, [ 2559.190746] env[62277]: ERROR nova.compute.manager [instance: 297d53df-7918-4389-9c63-a600755da969] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2559.190746] env[62277]: ERROR nova.compute.manager [instance: 297d53df-7918-4389-9c63-a600755da969] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2559.190746] env[62277]: ERROR nova.compute.manager [instance: 297d53df-7918-4389-9c63-a600755da969] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2559.190746] env[62277]: ERROR nova.compute.manager [instance: 297d53df-7918-4389-9c63-a600755da969] self._fetch_image_if_missing(context, vi) [ 2559.190746] env[62277]: ERROR nova.compute.manager [instance: 297d53df-7918-4389-9c63-a600755da969] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2559.190746] env[62277]: ERROR nova.compute.manager [instance: 297d53df-7918-4389-9c63-a600755da969] image_cache(vi, tmp_image_ds_loc) [ 2559.190746] env[62277]: ERROR nova.compute.manager [instance: 297d53df-7918-4389-9c63-a600755da969] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2559.190746] env[62277]: ERROR nova.compute.manager [instance: 297d53df-7918-4389-9c63-a600755da969] vm_util.copy_virtual_disk( [ 2559.190746] env[62277]: ERROR nova.compute.manager [instance: 297d53df-7918-4389-9c63-a600755da969] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2559.190746] env[62277]: ERROR nova.compute.manager [instance: 297d53df-7918-4389-9c63-a600755da969] session._wait_for_task(vmdk_copy_task) [ 2559.190746] env[62277]: ERROR nova.compute.manager [instance: 297d53df-7918-4389-9c63-a600755da969] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2559.190746] env[62277]: ERROR nova.compute.manager [instance: 297d53df-7918-4389-9c63-a600755da969] return self.wait_for_task(task_ref) [ 2559.190746] env[62277]: ERROR nova.compute.manager [instance: 297d53df-7918-4389-9c63-a600755da969] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2559.190746] env[62277]: ERROR nova.compute.manager [instance: 297d53df-7918-4389-9c63-a600755da969] return evt.wait() [ 2559.190746] env[62277]: ERROR nova.compute.manager [instance: 297d53df-7918-4389-9c63-a600755da969] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2559.190746] env[62277]: ERROR nova.compute.manager [instance: 297d53df-7918-4389-9c63-a600755da969] result = hub.switch() [ 2559.190746] env[62277]: ERROR nova.compute.manager [instance: 297d53df-7918-4389-9c63-a600755da969] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2559.190746] env[62277]: ERROR nova.compute.manager [instance: 297d53df-7918-4389-9c63-a600755da969] return self.greenlet.switch() [ 2559.190746] env[62277]: ERROR nova.compute.manager [instance: 297d53df-7918-4389-9c63-a600755da969] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2559.190746] env[62277]: ERROR nova.compute.manager [instance: 297d53df-7918-4389-9c63-a600755da969] self.f(*self.args, **self.kw) [ 2559.190746] env[62277]: ERROR nova.compute.manager [instance: 297d53df-7918-4389-9c63-a600755da969] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2559.190746] env[62277]: ERROR nova.compute.manager [instance: 297d53df-7918-4389-9c63-a600755da969] raise exceptions.translate_fault(task_info.error) [ 2559.190746] env[62277]: ERROR nova.compute.manager [instance: 297d53df-7918-4389-9c63-a600755da969] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2559.190746] env[62277]: ERROR nova.compute.manager [instance: 297d53df-7918-4389-9c63-a600755da969] Faults: ['InvalidArgument'] [ 2559.190746] env[62277]: ERROR nova.compute.manager [instance: 297d53df-7918-4389-9c63-a600755da969] [ 2559.191620] env[62277]: INFO nova.compute.manager [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 297d53df-7918-4389-9c63-a600755da969] Terminating instance [ 2559.192621] env[62277]: DEBUG oslo_concurrency.lockutils [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2559.192826] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2559.193089] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-79ea9076-2a98-4ec1-b806-f8a95c532921 {{(pid=62277) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2559.196604] env[62277]: DEBUG nova.compute.manager [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 297d53df-7918-4389-9c63-a600755da969] Start destroying the instance on the hypervisor. {{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2559.196794] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 297d53df-7918-4389-9c63-a600755da969] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2559.197576] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d46a4511-f864-4120-aa22-76c7ea273faf {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2559.201107] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2559.201310] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62277) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2559.202259] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9866fcd6-09c6-4f67-a44c-97960dbbb85a {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2559.206122] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 297d53df-7918-4389-9c63-a600755da969] Unregistering the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2559.206598] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-7aeff21f-1c34-4d77-999e-c943d2d77759 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2559.209102] env[62277]: DEBUG oslo_vmware.api [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Waiting for the task: (returnval){ [ 2559.209102] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]524da314-b4b9-9072-6944-10a937739f76" [ 2559.209102] env[62277]: _type = "Task" [ 2559.209102] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2559.216462] env[62277]: DEBUG oslo_vmware.api [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]524da314-b4b9-9072-6944-10a937739f76, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2559.276448] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 297d53df-7918-4389-9c63-a600755da969] Unregistered the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2559.276653] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 297d53df-7918-4389-9c63-a600755da969] Deleting contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2559.276826] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Deleting the datastore file [datastore2] 297d53df-7918-4389-9c63-a600755da969 {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2559.277121] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-e4aeac99-2c79-4e75-8d37-11cdf9d86ab7 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2559.284273] env[62277]: DEBUG oslo_vmware.api [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Waiting for the task: (returnval){ [ 2559.284273] env[62277]: value = "task-1405527" [ 2559.284273] env[62277]: _type = "Task" [ 2559.284273] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2559.291450] env[62277]: DEBUG oslo_vmware.api [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Task: {'id': task-1405527, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2559.719442] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Preparing fetch location {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2559.719768] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Creating directory with path [datastore2] vmware_temp/d78ae7f7-f324-4074-890d-79213f00720a/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2559.719877] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3a2576eb-cd68-4740-b6f8-c4c11d49f8f4 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2559.730884] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Created directory with path [datastore2] vmware_temp/d78ae7f7-f324-4074-890d-79213f00720a/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2559.731074] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Fetch image to [datastore2] vmware_temp/d78ae7f7-f324-4074-890d-79213f00720a/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2559.731247] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to [datastore2] vmware_temp/d78ae7f7-f324-4074-890d-79213f00720a/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2559.731945] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9aaeb645-c82b-4b1d-b91f-b73af8a98592 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2559.738309] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-667de443-c55b-4ed0-bf82-f928e03a2ee6 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2559.747888] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4db7b19c-b711-47f6-a5a1-22ba31c2a557 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2559.778150] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4dedc54e-1dc5-4aa3-80c0-613fe256cc0c {{(pid=62277) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2559.783640] env[62277]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-c9abe7d7-0f32-4e61-8890-a61a1cdbbe0f {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2559.792638] env[62277]: DEBUG oslo_vmware.api [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Task: {'id': task-1405527, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.076997} completed successfully. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2559.792865] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Deleted the datastore file {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2559.793051] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 297d53df-7918-4389-9c63-a600755da969] Deleted contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2559.793228] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 297d53df-7918-4389-9c63-a600755da969] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2559.793397] env[62277]: INFO nova.compute.manager [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 297d53df-7918-4389-9c63-a600755da969] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 2559.795478] env[62277]: DEBUG nova.compute.claims [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 297d53df-7918-4389-9c63-a600755da969] Aborting claim: {{(pid=62277) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 2559.795646] env[62277]: DEBUG oslo_concurrency.lockutils [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2559.795851] env[62277]: DEBUG oslo_concurrency.lockutils [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2559.808052] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2559.858209] env[62277]: DEBUG oslo_vmware.rw_handles [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/d78ae7f7-f324-4074-890d-79213f00720a/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62277) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2559.919347] env[62277]: DEBUG oslo_vmware.rw_handles [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Completed reading data from the image iterator. {{(pid=62277) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2559.919586] env[62277]: DEBUG oslo_vmware.rw_handles [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/d78ae7f7-f324-4074-890d-79213f00720a/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=62277) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2559.977964] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f26d4361-d0f3-44bf-ba9a-4f6515a6e1ea {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2559.985329] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c9234c9-7e15-4c2c-9fb1-87657b493624 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2560.015517] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a99ac55b-d6fc-4d51-9eff-afde500d61ce {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2560.022341] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-94126f79-b876-4c60-bc46-e39b47595246 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2560.034844] env[62277]: DEBUG nova.compute.provider_tree [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2560.043740] env[62277]: DEBUG nova.scheduler.client.report [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2560.057169] env[62277]: DEBUG oslo_concurrency.lockutils [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.261s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2560.057680] env[62277]: ERROR nova.compute.manager [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 297d53df-7918-4389-9c63-a600755da969] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2560.057680] env[62277]: Faults: ['InvalidArgument'] [ 2560.057680] env[62277]: ERROR nova.compute.manager [instance: 297d53df-7918-4389-9c63-a600755da969] Traceback (most recent call last): [ 2560.057680] env[62277]: ERROR nova.compute.manager [instance: 297d53df-7918-4389-9c63-a600755da969] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2560.057680] 
env[62277]: ERROR nova.compute.manager [instance: 297d53df-7918-4389-9c63-a600755da969] self.driver.spawn(context, instance, image_meta, [ 2560.057680] env[62277]: ERROR nova.compute.manager [instance: 297d53df-7918-4389-9c63-a600755da969] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2560.057680] env[62277]: ERROR nova.compute.manager [instance: 297d53df-7918-4389-9c63-a600755da969] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2560.057680] env[62277]: ERROR nova.compute.manager [instance: 297d53df-7918-4389-9c63-a600755da969] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2560.057680] env[62277]: ERROR nova.compute.manager [instance: 297d53df-7918-4389-9c63-a600755da969] self._fetch_image_if_missing(context, vi) [ 2560.057680] env[62277]: ERROR nova.compute.manager [instance: 297d53df-7918-4389-9c63-a600755da969] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2560.057680] env[62277]: ERROR nova.compute.manager [instance: 297d53df-7918-4389-9c63-a600755da969] image_cache(vi, tmp_image_ds_loc) [ 2560.057680] env[62277]: ERROR nova.compute.manager [instance: 297d53df-7918-4389-9c63-a600755da969] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2560.057680] env[62277]: ERROR nova.compute.manager [instance: 297d53df-7918-4389-9c63-a600755da969] vm_util.copy_virtual_disk( [ 2560.057680] env[62277]: ERROR nova.compute.manager [instance: 297d53df-7918-4389-9c63-a600755da969] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2560.057680] env[62277]: ERROR nova.compute.manager [instance: 297d53df-7918-4389-9c63-a600755da969] session._wait_for_task(vmdk_copy_task) [ 2560.057680] env[62277]: ERROR nova.compute.manager [instance: 297d53df-7918-4389-9c63-a600755da969] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2560.057680] env[62277]: ERROR nova.compute.manager [instance: 297d53df-7918-4389-9c63-a600755da969] return self.wait_for_task(task_ref) [ 2560.057680] env[62277]: ERROR nova.compute.manager [instance: 297d53df-7918-4389-9c63-a600755da969] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2560.057680] env[62277]: ERROR nova.compute.manager [instance: 297d53df-7918-4389-9c63-a600755da969] return evt.wait() [ 2560.057680] env[62277]: ERROR nova.compute.manager [instance: 297d53df-7918-4389-9c63-a600755da969] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2560.057680] env[62277]: ERROR nova.compute.manager [instance: 297d53df-7918-4389-9c63-a600755da969] result = hub.switch() [ 2560.057680] env[62277]: ERROR nova.compute.manager [instance: 297d53df-7918-4389-9c63-a600755da969] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2560.057680] env[62277]: ERROR nova.compute.manager [instance: 297d53df-7918-4389-9c63-a600755da969] return self.greenlet.switch() [ 2560.057680] env[62277]: ERROR nova.compute.manager [instance: 297d53df-7918-4389-9c63-a600755da969] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2560.057680] env[62277]: ERROR nova.compute.manager [instance: 297d53df-7918-4389-9c63-a600755da969] self.f(*self.args, **self.kw) [ 2560.057680] env[62277]: ERROR nova.compute.manager [instance: 297d53df-7918-4389-9c63-a600755da969] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2560.057680] env[62277]: ERROR nova.compute.manager [instance: 297d53df-7918-4389-9c63-a600755da969] raise exceptions.translate_fault(task_info.error) [ 2560.057680] env[62277]: ERROR nova.compute.manager [instance: 297d53df-7918-4389-9c63-a600755da969] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2560.057680] env[62277]: ERROR nova.compute.manager [instance: 297d53df-7918-4389-9c63-a600755da969] Faults: ['InvalidArgument'] [ 2560.057680] env[62277]: ERROR nova.compute.manager [instance: 297d53df-7918-4389-9c63-a600755da969] [ 2560.058497] env[62277]: DEBUG nova.compute.utils [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 297d53df-7918-4389-9c63-a600755da969] VimFaultException {{(pid=62277) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2560.060080] env[62277]: DEBUG nova.compute.manager [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 297d53df-7918-4389-9c63-a600755da969] Build of instance 297d53df-7918-4389-9c63-a600755da969 was re-scheduled: A specified parameter was not correct: fileType [ 2560.060080] env[62277]: Faults: ['InvalidArgument'] {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2560.060454] env[62277]: DEBUG nova.compute.manager [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 297d53df-7918-4389-9c63-a600755da969] Unplugging VIFs for instance {{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2560.060625] env[62277]: DEBUG nova.compute.manager [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2560.060789] env[62277]: DEBUG nova.compute.manager [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 297d53df-7918-4389-9c63-a600755da969] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2560.060948] env[62277]: DEBUG nova.network.neutron [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 297d53df-7918-4389-9c63-a600755da969] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2560.358518] env[62277]: DEBUG nova.network.neutron [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 297d53df-7918-4389-9c63-a600755da969] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2560.369563] env[62277]: INFO nova.compute.manager [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 297d53df-7918-4389-9c63-a600755da969] Took 0.31 seconds to deallocate network for instance. [ 2560.457573] env[62277]: INFO nova.scheduler.client.report [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Deleted allocations for instance 297d53df-7918-4389-9c63-a600755da969 [ 2560.477982] env[62277]: DEBUG oslo_concurrency.lockutils [None req-591e8346-1aea-48ee-9503-485e0a5d2231 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Lock "297d53df-7918-4389-9c63-a600755da969" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 579.974s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2560.478267] env[62277]: DEBUG oslo_concurrency.lockutils [None req-a0f67795-2bbf-4c83-a3ce-b2d8fcd83060 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Lock "297d53df-7918-4389-9c63-a600755da969" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 383.480s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2560.478490] env[62277]: DEBUG oslo_concurrency.lockutils [None req-a0f67795-2bbf-4c83-a3ce-b2d8fcd83060 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Acquiring lock "297d53df-7918-4389-9c63-a600755da969-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2560.478690] env[62277]: DEBUG oslo_concurrency.lockutils [None req-a0f67795-2bbf-4c83-a3ce-b2d8fcd83060 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Lock "297d53df-7918-4389-9c63-a600755da969-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2560.479079] env[62277]: DEBUG oslo_concurrency.lockutils [None req-a0f67795-2bbf-4c83-a3ce-b2d8fcd83060 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Lock "297d53df-7918-4389-9c63-a600755da969-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2560.480996] env[62277]: INFO nova.compute.manager [None req-a0f67795-2bbf-4c83-a3ce-b2d8fcd83060 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 297d53df-7918-4389-9c63-a600755da969] Terminating instance [ 2560.482654] env[62277]: DEBUG nova.compute.manager [None req-a0f67795-2bbf-4c83-a3ce-b2d8fcd83060 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 297d53df-7918-4389-9c63-a600755da969] Start destroying the instance on the hypervisor. {{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2560.482853] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-a0f67795-2bbf-4c83-a3ce-b2d8fcd83060 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 297d53df-7918-4389-9c63-a600755da969] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2560.483370] env[62277]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-66ead43a-cef8-453e-9c0e-c400ed6d4dc0 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2560.492794] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9561dd8a-9e89-4b3c-9ddf-1663a0cecf00 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2560.519025] env[62277]: WARNING nova.virt.vmwareapi.vmops [None req-a0f67795-2bbf-4c83-a3ce-b2d8fcd83060 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 297d53df-7918-4389-9c63-a600755da969] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 297d53df-7918-4389-9c63-a600755da969 could not be found. [ 2560.519268] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-a0f67795-2bbf-4c83-a3ce-b2d8fcd83060 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 297d53df-7918-4389-9c63-a600755da969] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2560.519452] env[62277]: INFO nova.compute.manager [None req-a0f67795-2bbf-4c83-a3ce-b2d8fcd83060 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] [instance: 297d53df-7918-4389-9c63-a600755da969] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 2560.519691] env[62277]: DEBUG oslo.service.loopingcall [None req-a0f67795-2bbf-4c83-a3ce-b2d8fcd83060 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2560.520150] env[62277]: DEBUG nova.compute.manager [-] [instance: 297d53df-7918-4389-9c63-a600755da969] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2560.520252] env[62277]: DEBUG nova.network.neutron [-] [instance: 297d53df-7918-4389-9c63-a600755da969] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2560.542794] env[62277]: DEBUG nova.network.neutron [-] [instance: 297d53df-7918-4389-9c63-a600755da969] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2560.550370] env[62277]: INFO nova.compute.manager [-] [instance: 297d53df-7918-4389-9c63-a600755da969] Took 0.03 seconds to deallocate network for instance. [ 2560.645284] env[62277]: DEBUG oslo_concurrency.lockutils [None req-a0f67795-2bbf-4c83-a3ce-b2d8fcd83060 tempest-AttachInterfacesTestJSON-2037203206 tempest-AttachInterfacesTestJSON-2037203206-project-member] Lock "297d53df-7918-4389-9c63-a600755da969" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.167s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2560.646955] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "297d53df-7918-4389-9c63-a600755da969" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 78.530s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2560.647201] env[62277]: INFO nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 297d53df-7918-4389-9c63-a600755da969] During sync_power_state the instance has a pending task (deleting). Skip. 
[ 2560.647386] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "297d53df-7918-4389-9c63-a600755da969" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2561.168524] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2562.170045] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2562.170045] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Starting heal instance info cache {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9909}} [ 2562.170045] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Rebuilding the list of instances to heal {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9913}} [ 2562.184753] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2562.184863] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2562.184989] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2562.185150] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2562.185261] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2562.185378] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 5683f242-4848-42fa-9353-46982c3a72c0] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2562.185495] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Didn't find any instances for network info cache update. 
{{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9995}} [ 2562.185965] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager.update_available_resource {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2562.196258] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2562.196455] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2562.196616] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2562.196761] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62277) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2562.197819] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0203844-4382-4a45-b0b6-b0f56390063f {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2562.206297] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7cfc9153-a9f4-4d66-b2c8-45166df98051 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2562.219845] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42f682cd-2b31-4653-861e-08820860e354 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2562.225676] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-85ae18c0-e3e6-44ca-b939-5d670bf6e057 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2562.253489] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181436MB free_disk=184GB free_vcpus=48 pci_devices=None {{(pid=62277) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2562.253623] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 
2562.253804] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2562.306732] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 940561d5-723b-4e43-8fab-35e8af95ce09 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2562.306880] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 6737c3b9-d9e6-4879-a6df-46d3c7dee40e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2562.307023] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 4f26ed27-558d-489a-9141-ec63b6164cc8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2562.307178] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 26a7549d-94b4-4113-ab8b-10886eafcd49 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2562.307304] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 295cb2fd-b409-4d5c-8fef-12b7acd9fec0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2562.307420] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 5683f242-4848-42fa-9353-46982c3a72c0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2562.307591] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Total usable vcpus: 48, total allocated vcpus: 6 {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2562.307725] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1280MB phys_disk=200GB used_disk=6GB total_vcpus=48 used_vcpus=6 pci_stats=[] {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2562.378491] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48f1043f-a294-4ce7-9729-8b0f350a718a {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2562.386085] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a5414bb-a85d-4ac9-ac0a-392a5ce7fb84 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2562.415816] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fcb7bb0c-3cf9-44b0-ac72-39c1b8fa5159 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2562.422709] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e1b7ad39-c42e-4b07-b787-d09c6c4d54ac {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2562.435312] env[62277]: DEBUG nova.compute.provider_tree [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2562.442822] env[62277]: DEBUG nova.scheduler.client.report [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2562.455265] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62277) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2562.455428] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.202s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2563.438913] env[62277]: DEBUG oslo_service.periodic_task [None 
req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2565.168772] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2565.169147] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2565.169147] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=62277) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10528}} [ 2570.164522] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2593.638016] env[62277]: DEBUG oslo_concurrency.lockutils [None req-9a6d463f-7d89-4567-8c30-3824d5a78c86 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Acquiring lock "6555137f-42d8-4e07-8b1f-b1e431d082ad" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2593.638260] env[62277]: DEBUG oslo_concurrency.lockutils [None req-9a6d463f-7d89-4567-8c30-3824d5a78c86 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Lock "6555137f-42d8-4e07-8b1f-b1e431d082ad" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2593.649651] env[62277]: DEBUG nova.compute.manager [None req-9a6d463f-7d89-4567-8c30-3824d5a78c86 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: 6555137f-42d8-4e07-8b1f-b1e431d082ad] Starting instance... 
{{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 2593.703269] env[62277]: DEBUG oslo_concurrency.lockutils [None req-9a6d463f-7d89-4567-8c30-3824d5a78c86 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2593.703612] env[62277]: DEBUG oslo_concurrency.lockutils [None req-9a6d463f-7d89-4567-8c30-3824d5a78c86 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2593.705949] env[62277]: INFO nova.compute.claims [None req-9a6d463f-7d89-4567-8c30-3824d5a78c86 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: 6555137f-42d8-4e07-8b1f-b1e431d082ad] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2593.838043] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01483b66-2c38-448b-9a11-0f5630d36a15 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2593.845642] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-786dd4ec-1e46-4a79-8fc6-a001789a0565 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2593.874321] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e345f05-c499-42e3-995b-f9cd125afb7b {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2593.880870] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ed3a704-2c2d-4851-8501-4f2a1853fd23 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2593.893496] env[62277]: DEBUG nova.compute.provider_tree [None req-9a6d463f-7d89-4567-8c30-3824d5a78c86 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2593.901779] env[62277]: DEBUG nova.scheduler.client.report [None req-9a6d463f-7d89-4567-8c30-3824d5a78c86 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2593.915072] env[62277]: DEBUG oslo_concurrency.lockutils [None req-9a6d463f-7d89-4567-8c30-3824d5a78c86 tempest-ServersTestJSON-1136389312 
tempest-ServersTestJSON-1136389312-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.211s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2593.915541] env[62277]: DEBUG nova.compute.manager [None req-9a6d463f-7d89-4567-8c30-3824d5a78c86 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: 6555137f-42d8-4e07-8b1f-b1e431d082ad] Start building networks asynchronously for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 2593.947230] env[62277]: DEBUG nova.compute.utils [None req-9a6d463f-7d89-4567-8c30-3824d5a78c86 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Using /dev/sd instead of None {{(pid=62277) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2593.948873] env[62277]: DEBUG nova.compute.manager [None req-9a6d463f-7d89-4567-8c30-3824d5a78c86 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: 6555137f-42d8-4e07-8b1f-b1e431d082ad] Allocating IP information in the background. {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 2593.948873] env[62277]: DEBUG nova.network.neutron [None req-9a6d463f-7d89-4567-8c30-3824d5a78c86 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: 6555137f-42d8-4e07-8b1f-b1e431d082ad] allocate_for_instance() {{(pid=62277) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2593.957306] env[62277]: DEBUG nova.compute.manager [None req-9a6d463f-7d89-4567-8c30-3824d5a78c86 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: 6555137f-42d8-4e07-8b1f-b1e431d082ad] Start building block device mappings for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 2594.002745] env[62277]: DEBUG nova.policy [None req-9a6d463f-7d89-4567-8c30-3824d5a78c86 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '696edb47b3844d7499217e84fcf42619', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e7e15898bc784416bdc7fa9a9423726f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62277) authorize /opt/stack/nova/nova/policy.py:203}} [ 2594.017277] env[62277]: DEBUG nova.compute.manager [None req-9a6d463f-7d89-4567-8c30-3824d5a78c86 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: 6555137f-42d8-4e07-8b1f-b1e431d082ad] Start spawning the instance on the hypervisor. 
{{(pid=62277) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 2594.046960] env[62277]: DEBUG nova.virt.hardware [None req-9a6d463f-7d89-4567-8c30-3824d5a78c86 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-11-06T22:11:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-11-06T22:11:15Z,direct_url=,disk_format='vmdk',id=6f125163-af69-40e9-92ae-3b8a01d74b60,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='63ed6630e9c140baa826f53d7a0564d1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-11-06T22:11:16Z,virtual_size=,visibility=), allow threads: False {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2594.047218] env[62277]: DEBUG nova.virt.hardware [None req-9a6d463f-7d89-4567-8c30-3824d5a78c86 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Flavor limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2594.047377] env[62277]: DEBUG nova.virt.hardware [None req-9a6d463f-7d89-4567-8c30-3824d5a78c86 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Image limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2594.047557] env[62277]: DEBUG nova.virt.hardware [None req-9a6d463f-7d89-4567-8c30-3824d5a78c86 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Flavor pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2594.047703] env[62277]: DEBUG nova.virt.hardware [None req-9a6d463f-7d89-4567-8c30-3824d5a78c86 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Image pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2594.047850] env[62277]: DEBUG nova.virt.hardware [None req-9a6d463f-7d89-4567-8c30-3824d5a78c86 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2594.048063] env[62277]: DEBUG nova.virt.hardware [None req-9a6d463f-7d89-4567-8c30-3824d5a78c86 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2594.048225] env[62277]: DEBUG nova.virt.hardware [None req-9a6d463f-7d89-4567-8c30-3824d5a78c86 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2594.048386] env[62277]: DEBUG nova.virt.hardware [None req-9a6d463f-7d89-4567-8c30-3824d5a78c86 tempest-ServersTestJSON-1136389312 
tempest-ServersTestJSON-1136389312-project-member] Got 1 possible topologies {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2594.048545] env[62277]: DEBUG nova.virt.hardware [None req-9a6d463f-7d89-4567-8c30-3824d5a78c86 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2594.048723] env[62277]: DEBUG nova.virt.hardware [None req-9a6d463f-7d89-4567-8c30-3824d5a78c86 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2594.049572] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e933a924-81f8-45d1-894f-89849ecbb9d7 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2594.058041] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-540c811d-efa9-4ea6-9cf7-d2a53c5210cf {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2594.348452] env[62277]: DEBUG nova.network.neutron [None req-9a6d463f-7d89-4567-8c30-3824d5a78c86 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: 6555137f-42d8-4e07-8b1f-b1e431d082ad] Successfully created port: a7670eb0-0b0a-458c-8ae2-2954e8cd14ab {{(pid=62277) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2594.908411] env[62277]: DEBUG nova.compute.manager [req-248fa1a5-09e6-42a4-bf60-e2e981b9404a req-703a1263-e51d-43da-8268-62e7d2d50d2b service nova] [instance: 6555137f-42d8-4e07-8b1f-b1e431d082ad] Received event network-vif-plugged-a7670eb0-0b0a-458c-8ae2-2954e8cd14ab {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 2594.908411] env[62277]: DEBUG oslo_concurrency.lockutils [req-248fa1a5-09e6-42a4-bf60-e2e981b9404a req-703a1263-e51d-43da-8268-62e7d2d50d2b service nova] Acquiring lock "6555137f-42d8-4e07-8b1f-b1e431d082ad-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2594.908764] env[62277]: DEBUG oslo_concurrency.lockutils [req-248fa1a5-09e6-42a4-bf60-e2e981b9404a req-703a1263-e51d-43da-8268-62e7d2d50d2b service nova] Lock "6555137f-42d8-4e07-8b1f-b1e431d082ad-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2594.908764] env[62277]: DEBUG oslo_concurrency.lockutils [req-248fa1a5-09e6-42a4-bf60-e2e981b9404a req-703a1263-e51d-43da-8268-62e7d2d50d2b service nova] Lock "6555137f-42d8-4e07-8b1f-b1e431d082ad-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2594.908840] env[62277]: DEBUG nova.compute.manager [req-248fa1a5-09e6-42a4-bf60-e2e981b9404a req-703a1263-e51d-43da-8268-62e7d2d50d2b service nova] [instance: 6555137f-42d8-4e07-8b1f-b1e431d082ad] 
No waiting events found dispatching network-vif-plugged-a7670eb0-0b0a-458c-8ae2-2954e8cd14ab {{(pid=62277) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2594.908968] env[62277]: WARNING nova.compute.manager [req-248fa1a5-09e6-42a4-bf60-e2e981b9404a req-703a1263-e51d-43da-8268-62e7d2d50d2b service nova] [instance: 6555137f-42d8-4e07-8b1f-b1e431d082ad] Received unexpected event network-vif-plugged-a7670eb0-0b0a-458c-8ae2-2954e8cd14ab for instance with vm_state building and task_state spawning. [ 2595.671163] env[62277]: DEBUG nova.network.neutron [None req-9a6d463f-7d89-4567-8c30-3824d5a78c86 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: 6555137f-42d8-4e07-8b1f-b1e431d082ad] Successfully updated port: a7670eb0-0b0a-458c-8ae2-2954e8cd14ab {{(pid=62277) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2595.681059] env[62277]: DEBUG oslo_concurrency.lockutils [None req-9a6d463f-7d89-4567-8c30-3824d5a78c86 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Acquiring lock "refresh_cache-6555137f-42d8-4e07-8b1f-b1e431d082ad" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2595.681212] env[62277]: DEBUG oslo_concurrency.lockutils [None req-9a6d463f-7d89-4567-8c30-3824d5a78c86 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Acquired lock "refresh_cache-6555137f-42d8-4e07-8b1f-b1e431d082ad" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2595.681338] env[62277]: DEBUG nova.network.neutron [None req-9a6d463f-7d89-4567-8c30-3824d5a78c86 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: 6555137f-42d8-4e07-8b1f-b1e431d082ad] Building network info cache for instance {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2595.712936] env[62277]: DEBUG nova.network.neutron [None req-9a6d463f-7d89-4567-8c30-3824d5a78c86 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: 6555137f-42d8-4e07-8b1f-b1e431d082ad] Instance cache missing network info. 
{{(pid=62277) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2595.859821] env[62277]: DEBUG nova.network.neutron [None req-9a6d463f-7d89-4567-8c30-3824d5a78c86 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: 6555137f-42d8-4e07-8b1f-b1e431d082ad] Updating instance_info_cache with network_info: [{"id": "a7670eb0-0b0a-458c-8ae2-2954e8cd14ab", "address": "fa:16:3e:20:04:08", "network": {"id": "7efa6c69-4ed6-4615-b77a-53d6e045efc5", "bridge": "br-int", "label": "tempest-ServersTestJSON-132086172-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e7e15898bc784416bdc7fa9a9423726f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6b8137fc-f23d-49b1-b19c-3123a5588f34", "external-id": "nsx-vlan-transportzone-709", "segmentation_id": 709, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa7670eb0-0b", "ovs_interfaceid": "a7670eb0-0b0a-458c-8ae2-2954e8cd14ab", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2595.872201] env[62277]: DEBUG oslo_concurrency.lockutils [None req-9a6d463f-7d89-4567-8c30-3824d5a78c86 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Releasing lock "refresh_cache-6555137f-42d8-4e07-8b1f-b1e431d082ad" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2595.872464] env[62277]: DEBUG nova.compute.manager [None req-9a6d463f-7d89-4567-8c30-3824d5a78c86 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: 6555137f-42d8-4e07-8b1f-b1e431d082ad] Instance network_info: |[{"id": "a7670eb0-0b0a-458c-8ae2-2954e8cd14ab", "address": "fa:16:3e:20:04:08", "network": {"id": "7efa6c69-4ed6-4615-b77a-53d6e045efc5", "bridge": "br-int", "label": "tempest-ServersTestJSON-132086172-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e7e15898bc784416bdc7fa9a9423726f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6b8137fc-f23d-49b1-b19c-3123a5588f34", "external-id": "nsx-vlan-transportzone-709", "segmentation_id": 709, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa7670eb0-0b", "ovs_interfaceid": "a7670eb0-0b0a-458c-8ae2-2954e8cd14ab", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 2595.872897] env[62277]: 
DEBUG nova.virt.vmwareapi.vmops [None req-9a6d463f-7d89-4567-8c30-3824d5a78c86 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: 6555137f-42d8-4e07-8b1f-b1e431d082ad] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:20:04:08', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '6b8137fc-f23d-49b1-b19c-3123a5588f34', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'a7670eb0-0b0a-458c-8ae2-2954e8cd14ab', 'vif_model': 'vmxnet3'}] {{(pid=62277) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2595.880364] env[62277]: DEBUG oslo.service.loopingcall [None req-9a6d463f-7d89-4567-8c30-3824d5a78c86 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2595.880784] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 6555137f-42d8-4e07-8b1f-b1e431d082ad] Creating VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2595.880985] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-c87419f6-54b5-41fd-959c-a8bfcb30c3ba {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2595.903039] env[62277]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2595.903039] env[62277]: value = "task-1405528" [ 2595.903039] env[62277]: _type = "Task" [ 2595.903039] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2595.909901] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405528, 'name': CreateVM_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2596.411525] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405528, 'name': CreateVM_Task, 'duration_secs': 0.277587} completed successfully. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2596.411830] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 6555137f-42d8-4e07-8b1f-b1e431d082ad] Created VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2596.412532] env[62277]: DEBUG oslo_concurrency.lockutils [None req-9a6d463f-7d89-4567-8c30-3824d5a78c86 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2596.412700] env[62277]: DEBUG oslo_concurrency.lockutils [None req-9a6d463f-7d89-4567-8c30-3824d5a78c86 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2596.412945] env[62277]: DEBUG oslo_concurrency.lockutils [None req-9a6d463f-7d89-4567-8c30-3824d5a78c86 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2596.413212] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ce96eb9b-4510-4461-a894-f4c503ad8266 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2596.417337] env[62277]: DEBUG oslo_vmware.api [None req-9a6d463f-7d89-4567-8c30-3824d5a78c86 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Waiting for the task: (returnval){ [ 2596.417337] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52598819-fc69-c5ac-1034-73d05b9a2454" [ 2596.417337] env[62277]: _type = "Task" [ 2596.417337] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2596.424439] env[62277]: DEBUG oslo_vmware.api [None req-9a6d463f-7d89-4567-8c30-3824d5a78c86 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52598819-fc69-c5ac-1034-73d05b9a2454, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2596.928286] env[62277]: DEBUG oslo_concurrency.lockutils [None req-9a6d463f-7d89-4567-8c30-3824d5a78c86 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2596.928562] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-9a6d463f-7d89-4567-8c30-3824d5a78c86 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: 6555137f-42d8-4e07-8b1f-b1e431d082ad] Processing image 6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2596.928810] env[62277]: DEBUG oslo_concurrency.lockutils [None req-9a6d463f-7d89-4567-8c30-3824d5a78c86 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2596.933818] env[62277]: DEBUG nova.compute.manager [req-48c38e19-af3e-44ba-b054-81676479a51d req-98f86263-174f-4e7d-8590-5ef2d528481d service nova] [instance: 6555137f-42d8-4e07-8b1f-b1e431d082ad] Received event network-changed-a7670eb0-0b0a-458c-8ae2-2954e8cd14ab {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 2596.934009] env[62277]: DEBUG nova.compute.manager [req-48c38e19-af3e-44ba-b054-81676479a51d req-98f86263-174f-4e7d-8590-5ef2d528481d service nova] [instance: 6555137f-42d8-4e07-8b1f-b1e431d082ad] Refreshing instance network info cache due to event network-changed-a7670eb0-0b0a-458c-8ae2-2954e8cd14ab. {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11104}} [ 2596.934219] env[62277]: DEBUG oslo_concurrency.lockutils [req-48c38e19-af3e-44ba-b054-81676479a51d req-98f86263-174f-4e7d-8590-5ef2d528481d service nova] Acquiring lock "refresh_cache-6555137f-42d8-4e07-8b1f-b1e431d082ad" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2596.934333] env[62277]: DEBUG oslo_concurrency.lockutils [req-48c38e19-af3e-44ba-b054-81676479a51d req-98f86263-174f-4e7d-8590-5ef2d528481d service nova] Acquired lock "refresh_cache-6555137f-42d8-4e07-8b1f-b1e431d082ad" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2596.934506] env[62277]: DEBUG nova.network.neutron [req-48c38e19-af3e-44ba-b054-81676479a51d req-98f86263-174f-4e7d-8590-5ef2d528481d service nova] [instance: 6555137f-42d8-4e07-8b1f-b1e431d082ad] Refreshing network info cache for port a7670eb0-0b0a-458c-8ae2-2954e8cd14ab {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 2597.409785] env[62277]: DEBUG nova.network.neutron [req-48c38e19-af3e-44ba-b054-81676479a51d req-98f86263-174f-4e7d-8590-5ef2d528481d service nova] [instance: 6555137f-42d8-4e07-8b1f-b1e431d082ad] Updated VIF entry in instance network info cache for port a7670eb0-0b0a-458c-8ae2-2954e8cd14ab. 
{{(pid=62277) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 2597.410141] env[62277]: DEBUG nova.network.neutron [req-48c38e19-af3e-44ba-b054-81676479a51d req-98f86263-174f-4e7d-8590-5ef2d528481d service nova] [instance: 6555137f-42d8-4e07-8b1f-b1e431d082ad] Updating instance_info_cache with network_info: [{"id": "a7670eb0-0b0a-458c-8ae2-2954e8cd14ab", "address": "fa:16:3e:20:04:08", "network": {"id": "7efa6c69-4ed6-4615-b77a-53d6e045efc5", "bridge": "br-int", "label": "tempest-ServersTestJSON-132086172-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e7e15898bc784416bdc7fa9a9423726f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6b8137fc-f23d-49b1-b19c-3123a5588f34", "external-id": "nsx-vlan-transportzone-709", "segmentation_id": 709, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa7670eb0-0b", "ovs_interfaceid": "a7670eb0-0b0a-458c-8ae2-2954e8cd14ab", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2597.418924] env[62277]: DEBUG oslo_concurrency.lockutils [req-48c38e19-af3e-44ba-b054-81676479a51d req-98f86263-174f-4e7d-8590-5ef2d528481d service nova] Releasing lock "refresh_cache-6555137f-42d8-4e07-8b1f-b1e431d082ad" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2608.062150] env[62277]: WARNING oslo_vmware.rw_handles [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2608.062150] env[62277]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2608.062150] env[62277]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2608.062150] env[62277]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2608.062150] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2608.062150] env[62277]: ERROR oslo_vmware.rw_handles response.begin() [ 2608.062150] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2608.062150] env[62277]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2608.062150] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2608.062150] env[62277]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2608.062150] env[62277]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2608.062150] env[62277]: ERROR oslo_vmware.rw_handles [ 2608.062719] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 
tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Downloaded image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to vmware_temp/d78ae7f7-f324-4074-890d-79213f00720a/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2608.064771] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Caching image {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2608.065014] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Copying Virtual Disk [datastore2] vmware_temp/d78ae7f7-f324-4074-890d-79213f00720a/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk to [datastore2] vmware_temp/d78ae7f7-f324-4074-890d-79213f00720a/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk {{(pid=62277) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2608.065302] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-387e5f99-7324-4b73-9537-f25c21030202 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2608.073256] env[62277]: DEBUG oslo_vmware.api [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Waiting for the task: (returnval){ [ 2608.073256] env[62277]: value = "task-1405529" [ 2608.073256] env[62277]: _type = "Task" [ 2608.073256] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2608.082972] env[62277]: DEBUG oslo_vmware.api [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Task: {'id': task-1405529, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2608.583290] env[62277]: DEBUG oslo_vmware.exceptions [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Fault InvalidArgument not matched. 
{{(pid=62277) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2608.583557] env[62277]: DEBUG oslo_concurrency.lockutils [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2608.584102] env[62277]: ERROR nova.compute.manager [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2608.584102] env[62277]: Faults: ['InvalidArgument'] [ 2608.584102] env[62277]: ERROR nova.compute.manager [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Traceback (most recent call last): [ 2608.584102] env[62277]: ERROR nova.compute.manager [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2608.584102] env[62277]: ERROR nova.compute.manager [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] yield resources [ 2608.584102] env[62277]: ERROR nova.compute.manager [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2608.584102] env[62277]: ERROR nova.compute.manager [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] self.driver.spawn(context, instance, image_meta, [ 2608.584102] env[62277]: ERROR nova.compute.manager [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2608.584102] env[62277]: ERROR nova.compute.manager [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2608.584102] env[62277]: ERROR nova.compute.manager [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2608.584102] env[62277]: ERROR nova.compute.manager [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] self._fetch_image_if_missing(context, vi) [ 2608.584102] env[62277]: ERROR nova.compute.manager [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2608.584102] env[62277]: ERROR nova.compute.manager [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] image_cache(vi, tmp_image_ds_loc) [ 2608.584102] env[62277]: ERROR nova.compute.manager [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2608.584102] env[62277]: ERROR nova.compute.manager [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] vm_util.copy_virtual_disk( [ 2608.584102] env[62277]: ERROR nova.compute.manager [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2608.584102] env[62277]: ERROR nova.compute.manager [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] session._wait_for_task(vmdk_copy_task) [ 2608.584102] env[62277]: ERROR nova.compute.manager [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task 
[ 2608.584102] env[62277]: ERROR nova.compute.manager [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] return self.wait_for_task(task_ref) [ 2608.584102] env[62277]: ERROR nova.compute.manager [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2608.584102] env[62277]: ERROR nova.compute.manager [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] return evt.wait() [ 2608.584102] env[62277]: ERROR nova.compute.manager [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2608.584102] env[62277]: ERROR nova.compute.manager [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] result = hub.switch() [ 2608.584102] env[62277]: ERROR nova.compute.manager [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2608.584102] env[62277]: ERROR nova.compute.manager [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] return self.greenlet.switch() [ 2608.584102] env[62277]: ERROR nova.compute.manager [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2608.584102] env[62277]: ERROR nova.compute.manager [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] self.f(*self.args, **self.kw) [ 2608.584102] env[62277]: ERROR nova.compute.manager [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2608.584102] env[62277]: ERROR nova.compute.manager [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] raise exceptions.translate_fault(task_info.error) [ 2608.584102] env[62277]: ERROR nova.compute.manager [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2608.584102] env[62277]: ERROR nova.compute.manager [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Faults: ['InvalidArgument'] [ 2608.584102] env[62277]: ERROR nova.compute.manager [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] [ 2608.584999] env[62277]: INFO nova.compute.manager [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Terminating instance [ 2608.585918] env[62277]: DEBUG oslo_concurrency.lockutils [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2608.586135] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2608.586367] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-67c8e0e6-8111-40a0-97c8-a04d3fb04659 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} 
[ 2608.588620] env[62277]: DEBUG nova.compute.manager [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Start destroying the instance on the hypervisor. {{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2608.588810] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2608.589503] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33d88444-dac8-4c9b-b5bc-3f2a37f547a2 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2608.595869] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Unregistering the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2608.596081] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-8ff800f7-8d01-4fc9-8174-75169b15f718 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2608.598159] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2608.598325] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62277) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2608.599239] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9abc272b-24ed-446c-adb7-11d1bbcef2a2 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2608.604020] env[62277]: DEBUG oslo_vmware.api [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Waiting for the task: (returnval){ [ 2608.604020] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52c1d195-65ab-adda-f6cd-f0fa222b62f9" [ 2608.604020] env[62277]: _type = "Task" [ 2608.604020] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2608.610711] env[62277]: DEBUG oslo_vmware.api [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52c1d195-65ab-adda-f6cd-f0fa222b62f9, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2608.665663] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Unregistered the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2608.665902] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Deleting contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2608.666039] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Deleting the datastore file [datastore2] 940561d5-723b-4e43-8fab-35e8af95ce09 {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2608.666314] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-760824de-d900-4695-970b-24ec48bf9827 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2608.672523] env[62277]: DEBUG oslo_vmware.api [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Waiting for the task: (returnval){ [ 2608.672523] env[62277]: value = "task-1405531" [ 2608.672523] env[62277]: _type = "Task" [ 2608.672523] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2608.679817] env[62277]: DEBUG oslo_vmware.api [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Task: {'id': task-1405531, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2609.114644] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Preparing fetch location {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2609.115064] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Creating directory with path [datastore2] vmware_temp/a33c96c3-c1fd-4006-9e13-a95662aa44d9/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2609.115111] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-fc4e463c-e74a-45aa-87ad-b14d9121b22d {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2609.125814] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Created directory with path [datastore2] vmware_temp/a33c96c3-c1fd-4006-9e13-a95662aa44d9/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2609.125991] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Fetch image to [datastore2] vmware_temp/a33c96c3-c1fd-4006-9e13-a95662aa44d9/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2609.126171] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to [datastore2] vmware_temp/a33c96c3-c1fd-4006-9e13-a95662aa44d9/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2609.126888] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-001dede4-ee65-4b87-ac51-37c4ff6126ea {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2609.133173] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ae2489e7-3769-4f21-9603-2b9b820a4717 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2609.141739] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-399750c5-124a-44c4-97cf-8ef806be9da2 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2609.170891] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-292204f2-7325-4e52-bd9f-a8fd2b820657 {{(pid=62277) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2609.182192] env[62277]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-7b1afdbc-a439-48c1-98b0-c196097c3aa4 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2609.183778] env[62277]: DEBUG oslo_vmware.api [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Task: {'id': task-1405531, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.060745} completed successfully. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2609.184011] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Deleted the datastore file {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2609.184192] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Deleted contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2609.184352] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2609.184531] env[62277]: INFO nova.compute.manager [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 2609.186878] env[62277]: DEBUG nova.compute.claims [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Aborting claim: {{(pid=62277) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 2609.187057] env[62277]: DEBUG oslo_concurrency.lockutils [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2609.187266] env[62277]: DEBUG oslo_concurrency.lockutils [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2609.205034] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2609.264834] env[62277]: DEBUG oslo_vmware.rw_handles [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a33c96c3-c1fd-4006-9e13-a95662aa44d9/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62277) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2609.323142] env[62277]: DEBUG oslo_vmware.rw_handles [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Completed reading data from the image iterator. {{(pid=62277) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2609.323334] env[62277]: DEBUG oslo_vmware.rw_handles [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a33c96c3-c1fd-4006-9e13-a95662aa44d9/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=62277) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2609.379769] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a7d3dc2c-68e1-4c56-a511-5e15ae28557c {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2609.387289] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-138bf30b-f29b-484d-a479-26e5216e3adb {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2609.415857] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0e9be9f2-e989-4dbf-8528-9cd7f2da25e2 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2609.422847] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6836e228-c649-46bb-abc0-0b667fe17d0c {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2609.435453] env[62277]: DEBUG nova.compute.provider_tree [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2609.444017] env[62277]: DEBUG nova.scheduler.client.report [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2609.458685] env[62277]: DEBUG oslo_concurrency.lockutils [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.271s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2609.458881] env[62277]: ERROR nova.compute.manager [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2609.458881] env[62277]: Faults: ['InvalidArgument'] [ 2609.458881] env[62277]: ERROR nova.compute.manager [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Traceback (most recent call last): [ 2609.458881] env[62277]: ERROR nova.compute.manager [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2609.458881] env[62277]: ERROR nova.compute.manager [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] 
self.driver.spawn(context, instance, image_meta, [ 2609.458881] env[62277]: ERROR nova.compute.manager [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2609.458881] env[62277]: ERROR nova.compute.manager [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2609.458881] env[62277]: ERROR nova.compute.manager [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2609.458881] env[62277]: ERROR nova.compute.manager [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] self._fetch_image_if_missing(context, vi) [ 2609.458881] env[62277]: ERROR nova.compute.manager [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2609.458881] env[62277]: ERROR nova.compute.manager [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] image_cache(vi, tmp_image_ds_loc) [ 2609.458881] env[62277]: ERROR nova.compute.manager [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2609.458881] env[62277]: ERROR nova.compute.manager [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] vm_util.copy_virtual_disk( [ 2609.458881] env[62277]: ERROR nova.compute.manager [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2609.458881] env[62277]: ERROR nova.compute.manager [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] session._wait_for_task(vmdk_copy_task) [ 2609.458881] env[62277]: ERROR nova.compute.manager [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2609.458881] env[62277]: ERROR nova.compute.manager [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] return self.wait_for_task(task_ref) [ 2609.458881] env[62277]: ERROR nova.compute.manager [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2609.458881] env[62277]: ERROR nova.compute.manager [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] return evt.wait() [ 2609.458881] env[62277]: ERROR nova.compute.manager [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2609.458881] env[62277]: ERROR nova.compute.manager [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] result = hub.switch() [ 2609.458881] env[62277]: ERROR nova.compute.manager [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2609.458881] env[62277]: ERROR nova.compute.manager [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] return self.greenlet.switch() [ 2609.458881] env[62277]: ERROR nova.compute.manager [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2609.458881] env[62277]: ERROR nova.compute.manager [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] self.f(*self.args, **self.kw) [ 2609.458881] env[62277]: ERROR nova.compute.manager [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task 
[ 2609.458881] env[62277]: ERROR nova.compute.manager [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] raise exceptions.translate_fault(task_info.error) [ 2609.458881] env[62277]: ERROR nova.compute.manager [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2609.458881] env[62277]: ERROR nova.compute.manager [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Faults: ['InvalidArgument'] [ 2609.458881] env[62277]: ERROR nova.compute.manager [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] [ 2609.459613] env[62277]: DEBUG nova.compute.utils [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] VimFaultException {{(pid=62277) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2609.461123] env[62277]: DEBUG nova.compute.manager [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Build of instance 940561d5-723b-4e43-8fab-35e8af95ce09 was re-scheduled: A specified parameter was not correct: fileType [ 2609.461123] env[62277]: Faults: ['InvalidArgument'] {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2609.461495] env[62277]: DEBUG nova.compute.manager [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Unplugging VIFs for instance {{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2609.461662] env[62277]: DEBUG nova.compute.manager [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2609.461826] env[62277]: DEBUG nova.compute.manager [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2609.461987] env[62277]: DEBUG nova.network.neutron [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2609.792682] env[62277]: DEBUG nova.network.neutron [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2609.805351] env[62277]: INFO nova.compute.manager [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Took 0.34 seconds to deallocate network for instance. 
[ 2609.902010] env[62277]: INFO nova.scheduler.client.report [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Deleted allocations for instance 940561d5-723b-4e43-8fab-35e8af95ce09 [ 2609.927552] env[62277]: DEBUG oslo_concurrency.lockutils [None req-6c70d2ca-a7b1-4da0-a1cf-e3b34f6522d0 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Lock "940561d5-723b-4e43-8fab-35e8af95ce09" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 595.875s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2609.927925] env[62277]: DEBUG oslo_concurrency.lockutils [None req-6a798d6b-fafe-4058-9ff9-1999ee5fd9e1 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Lock "940561d5-723b-4e43-8fab-35e8af95ce09" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 400.187s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2609.928161] env[62277]: DEBUG oslo_concurrency.lockutils [None req-6a798d6b-fafe-4058-9ff9-1999ee5fd9e1 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Acquiring lock "940561d5-723b-4e43-8fab-35e8af95ce09-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2609.928370] env[62277]: DEBUG oslo_concurrency.lockutils [None req-6a798d6b-fafe-4058-9ff9-1999ee5fd9e1 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Lock "940561d5-723b-4e43-8fab-35e8af95ce09-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2609.928536] env[62277]: DEBUG oslo_concurrency.lockutils [None req-6a798d6b-fafe-4058-9ff9-1999ee5fd9e1 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Lock "940561d5-723b-4e43-8fab-35e8af95ce09-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2609.930648] env[62277]: INFO nova.compute.manager [None req-6a798d6b-fafe-4058-9ff9-1999ee5fd9e1 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Terminating instance [ 2609.932357] env[62277]: DEBUG nova.compute.manager [None req-6a798d6b-fafe-4058-9ff9-1999ee5fd9e1 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Start destroying the instance on the hypervisor. 
{{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2609.932538] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-6a798d6b-fafe-4058-9ff9-1999ee5fd9e1 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2609.932995] env[62277]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-6665a83e-6ff6-4a1f-b57d-11378ea4d671 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2609.943673] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-004baa41-fcae-4e84-beb4-e4db4c14b8ab {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2609.969616] env[62277]: WARNING nova.virt.vmwareapi.vmops [None req-6a798d6b-fafe-4058-9ff9-1999ee5fd9e1 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 940561d5-723b-4e43-8fab-35e8af95ce09 could not be found. [ 2609.971021] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-6a798d6b-fafe-4058-9ff9-1999ee5fd9e1 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2609.971021] env[62277]: INFO nova.compute.manager [None req-6a798d6b-fafe-4058-9ff9-1999ee5fd9e1 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2609.971021] env[62277]: DEBUG oslo.service.loopingcall [None req-6a798d6b-fafe-4058-9ff9-1999ee5fd9e1 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2609.971021] env[62277]: DEBUG nova.compute.manager [-] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2609.971021] env[62277]: DEBUG nova.network.neutron [-] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2609.999810] env[62277]: DEBUG nova.network.neutron [-] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2610.010928] env[62277]: INFO nova.compute.manager [-] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] Took 0.04 seconds to deallocate network for instance. 
[ 2610.089720] env[62277]: DEBUG oslo_concurrency.lockutils [None req-6a798d6b-fafe-4058-9ff9-1999ee5fd9e1 tempest-ImagesTestJSON-582743310 tempest-ImagesTestJSON-582743310-project-member] Lock "940561d5-723b-4e43-8fab-35e8af95ce09" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.162s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2610.090793] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "940561d5-723b-4e43-8fab-35e8af95ce09" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 127.973s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2610.090793] env[62277]: INFO nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 940561d5-723b-4e43-8fab-35e8af95ce09] During sync_power_state the instance has a pending task (deleting). Skip. [ 2610.090936] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "940561d5-723b-4e43-8fab-35e8af95ce09" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2612.168124] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2618.166023] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2619.169264] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2621.168825] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2622.169652] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager.update_available_resource {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2622.181036] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2622.181265] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s 
{{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2622.181433] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2622.181585] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62277) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2622.182712] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5647b6e2-05ef-4597-ab3c-36e846758143 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2622.191301] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e2a4a1e1-8b57-4ab3-a773-ef951fd29fbf {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2622.204707] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-46d8564b-a012-4918-a7ee-0f8e54475a80 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2622.210541] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a3eda8ed-10b3-4ed7-b242-c0321677d7cb {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2622.239160] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181391MB free_disk=184GB free_vcpus=48 pci_devices=None {{(pid=62277) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2622.239299] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2622.239484] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2622.299883] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 6737c3b9-d9e6-4879-a6df-46d3c7dee40e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2622.300062] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 4f26ed27-558d-489a-9141-ec63b6164cc8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2622.300194] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 26a7549d-94b4-4113-ab8b-10886eafcd49 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2622.300318] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 295cb2fd-b409-4d5c-8fef-12b7acd9fec0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2622.300441] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 5683f242-4848-42fa-9353-46982c3a72c0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2622.300556] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 6555137f-42d8-4e07-8b1f-b1e431d082ad actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2622.300734] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Total usable vcpus: 48, total allocated vcpus: 6 {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2622.300869] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1280MB phys_disk=200GB used_disk=6GB total_vcpus=48 used_vcpus=6 pci_stats=[] {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2622.376850] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-153e51be-ded4-49b3-b0d7-7acd949bb73d {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2622.384197] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6882148-548a-46d5-956b-71b260e77f4d {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2622.412597] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ff968f4-1e45-4df7-b513-3543640f2f99 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2622.419891] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-24c0b5e6-f98f-435f-b5b7-44208f66a6ed {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2622.433365] env[62277]: DEBUG nova.compute.provider_tree [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2622.441130] env[62277]: DEBUG nova.scheduler.client.report [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2622.454640] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62277) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2622.454844] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.215s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2623.454310] env[62277]: DEBUG oslo_service.periodic_task [None 
req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2623.454581] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Starting heal instance info cache {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9909}} [ 2623.454622] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Rebuilding the list of instances to heal {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9913}} [ 2623.470098] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2623.470245] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2623.470375] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2623.470499] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2623.470620] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 5683f242-4848-42fa-9353-46982c3a72c0] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2623.470739] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 6555137f-42d8-4e07-8b1f-b1e431d082ad] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2623.470862] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Didn't find any instances for network info cache update. 
{{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9995}} [ 2625.167923] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2625.168318] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2626.169456] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2626.169914] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=62277) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10528}} [ 2633.731084] env[62277]: DEBUG oslo_concurrency.lockutils [None req-bf19f26e-8bc2-42fc-adef-5ec82e233230 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Acquiring lock "5683f242-4848-42fa-9353-46982c3a72c0" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2657.480857] env[62277]: WARNING oslo_vmware.rw_handles [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2657.480857] env[62277]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2657.480857] env[62277]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2657.480857] env[62277]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2657.480857] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2657.480857] env[62277]: ERROR oslo_vmware.rw_handles response.begin() [ 2657.480857] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2657.480857] env[62277]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2657.480857] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2657.480857] env[62277]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2657.480857] env[62277]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2657.480857] env[62277]: ERROR oslo_vmware.rw_handles [ 2657.481638] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Downloaded image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to 
vmware_temp/a33c96c3-c1fd-4006-9e13-a95662aa44d9/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2657.483258] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Caching image {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2657.483517] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Copying Virtual Disk [datastore2] vmware_temp/a33c96c3-c1fd-4006-9e13-a95662aa44d9/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk to [datastore2] vmware_temp/a33c96c3-c1fd-4006-9e13-a95662aa44d9/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk {{(pid=62277) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2657.483807] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-a8036b53-ce6c-4fb6-9761-33c8760e7463 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2657.491532] env[62277]: DEBUG oslo_vmware.api [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Waiting for the task: (returnval){ [ 2657.491532] env[62277]: value = "task-1405532" [ 2657.491532] env[62277]: _type = "Task" [ 2657.491532] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2657.499339] env[62277]: DEBUG oslo_vmware.api [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Task: {'id': task-1405532, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2658.002301] env[62277]: DEBUG oslo_vmware.exceptions [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Fault InvalidArgument not matched. 
{{(pid=62277) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2658.002572] env[62277]: DEBUG oslo_concurrency.lockutils [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2658.003148] env[62277]: ERROR nova.compute.manager [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2658.003148] env[62277]: Faults: ['InvalidArgument'] [ 2658.003148] env[62277]: ERROR nova.compute.manager [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Traceback (most recent call last): [ 2658.003148] env[62277]: ERROR nova.compute.manager [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2658.003148] env[62277]: ERROR nova.compute.manager [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] yield resources [ 2658.003148] env[62277]: ERROR nova.compute.manager [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2658.003148] env[62277]: ERROR nova.compute.manager [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] self.driver.spawn(context, instance, image_meta, [ 2658.003148] env[62277]: ERROR nova.compute.manager [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2658.003148] env[62277]: ERROR nova.compute.manager [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2658.003148] env[62277]: ERROR nova.compute.manager [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2658.003148] env[62277]: ERROR nova.compute.manager [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] self._fetch_image_if_missing(context, vi) [ 2658.003148] env[62277]: ERROR nova.compute.manager [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2658.003148] env[62277]: ERROR nova.compute.manager [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] image_cache(vi, tmp_image_ds_loc) [ 2658.003148] env[62277]: ERROR nova.compute.manager [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2658.003148] env[62277]: ERROR nova.compute.manager [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] vm_util.copy_virtual_disk( [ 2658.003148] env[62277]: ERROR nova.compute.manager [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2658.003148] env[62277]: ERROR nova.compute.manager [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] session._wait_for_task(vmdk_copy_task) [ 2658.003148] env[62277]: ERROR nova.compute.manager [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", 
line 157, in _wait_for_task [ 2658.003148] env[62277]: ERROR nova.compute.manager [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] return self.wait_for_task(task_ref) [ 2658.003148] env[62277]: ERROR nova.compute.manager [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2658.003148] env[62277]: ERROR nova.compute.manager [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] return evt.wait() [ 2658.003148] env[62277]: ERROR nova.compute.manager [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2658.003148] env[62277]: ERROR nova.compute.manager [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] result = hub.switch() [ 2658.003148] env[62277]: ERROR nova.compute.manager [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2658.003148] env[62277]: ERROR nova.compute.manager [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] return self.greenlet.switch() [ 2658.003148] env[62277]: ERROR nova.compute.manager [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2658.003148] env[62277]: ERROR nova.compute.manager [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] self.f(*self.args, **self.kw) [ 2658.003148] env[62277]: ERROR nova.compute.manager [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2658.003148] env[62277]: ERROR nova.compute.manager [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] raise exceptions.translate_fault(task_info.error) [ 2658.003148] env[62277]: ERROR nova.compute.manager [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2658.003148] env[62277]: ERROR nova.compute.manager [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Faults: ['InvalidArgument'] [ 2658.003148] env[62277]: ERROR nova.compute.manager [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] [ 2658.004251] env[62277]: INFO nova.compute.manager [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Terminating instance [ 2658.004962] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2658.005191] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2658.005431] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a102dc4d-a931-4c28-94c1-7d2e9885e140 {{(pid=62277) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2658.007528] env[62277]: DEBUG nova.compute.manager [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Start destroying the instance on the hypervisor. {{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2658.007709] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2658.008440] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e3361bf-957e-4267-bda4-761a635f6bc2 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2658.015086] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Unregistering the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2658.015295] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-ea8758f3-5c99-4a9a-83a7-aa13e3e376b8 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2658.017317] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2658.017482] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62277) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2658.018398] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d16538e6-f88a-469c-a4a4-a0a7430b3b15 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2658.022869] env[62277]: DEBUG oslo_vmware.api [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Waiting for the task: (returnval){ [ 2658.022869] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52fc8f18-a205-17c3-b925-7ac05d7a1bc1" [ 2658.022869] env[62277]: _type = "Task" [ 2658.022869] env[62277]: } to complete. 
{{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2658.037181] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Preparing fetch location {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2658.037401] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Creating directory with path [datastore2] vmware_temp/32240725-c93a-42df-ba23-0bc0df59dfcb/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2658.037603] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-27e42bc9-36f4-4842-9e28-ef4218af1d05 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2658.058089] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Created directory with path [datastore2] vmware_temp/32240725-c93a-42df-ba23-0bc0df59dfcb/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2658.058293] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Fetch image to [datastore2] vmware_temp/32240725-c93a-42df-ba23-0bc0df59dfcb/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2658.058469] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to [datastore2] vmware_temp/32240725-c93a-42df-ba23-0bc0df59dfcb/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2658.059215] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3d402176-243b-43df-bea1-c72f94ec1e27 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2658.065611] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a3a609b1-7e94-4df7-8faa-bba1260a7ef9 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2658.075321] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fdea008b-9104-46a5-a98f-00725d7ecf63 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2658.105009] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a4b2c427-5b20-4247-b3a6-d6a3175b8126 {{(pid=62277) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2658.110501] env[62277]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-aa34abe7-2991-4e72-a4ed-da75c9843ffb {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2658.130345] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2658.178972] env[62277]: DEBUG oslo_vmware.rw_handles [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/32240725-c93a-42df-ba23-0bc0df59dfcb/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62277) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2658.238285] env[62277]: DEBUG oslo_vmware.rw_handles [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Completed reading data from the image iterator. {{(pid=62277) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2658.238548] env[62277]: DEBUG oslo_vmware.rw_handles [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/32240725-c93a-42df-ba23-0bc0df59dfcb/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=62277) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2658.302735] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Unregistered the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2658.302953] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Deleting contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2658.303148] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Deleting the datastore file [datastore2] 6737c3b9-d9e6-4879-a6df-46d3c7dee40e {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2658.303420] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-5aad374a-7a61-432d-baa5-07b4785eaaff {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2658.309561] env[62277]: DEBUG oslo_vmware.api [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Waiting for the task: (returnval){ [ 2658.309561] env[62277]: value = "task-1405534" [ 2658.309561] env[62277]: _type = "Task" [ 2658.309561] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2658.317117] env[62277]: DEBUG oslo_vmware.api [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Task: {'id': task-1405534, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2658.819820] env[62277]: DEBUG oslo_vmware.api [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Task: {'id': task-1405534, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.066888} completed successfully. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2658.820150] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Deleted the datastore file {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2658.820269] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Deleted contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2658.820437] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2658.820604] env[62277]: INFO nova.compute.manager [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Took 0.81 seconds to destroy the instance on the hypervisor. [ 2658.822659] env[62277]: DEBUG nova.compute.claims [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Aborting claim: {{(pid=62277) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 2658.822832] env[62277]: DEBUG oslo_concurrency.lockutils [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2658.823061] env[62277]: DEBUG oslo_concurrency.lockutils [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2658.941720] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b058b85a-8154-4dab-a46e-070ad0131b34 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2658.949241] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2a837b87-3c5d-4251-9a78-4611d81d7df1 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2658.978712] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-20c97dfc-87b2-43fc-8763-3fc58d00ee5e {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2658.985628] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-c8898132-de58-4f23-8ba5-9812e7aca201 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2658.998528] env[62277]: DEBUG nova.compute.provider_tree [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2659.007038] env[62277]: DEBUG nova.scheduler.client.report [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2659.020972] env[62277]: DEBUG oslo_concurrency.lockutils [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.198s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2659.021499] env[62277]: ERROR nova.compute.manager [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2659.021499] env[62277]: Faults: ['InvalidArgument'] [ 2659.021499] env[62277]: ERROR nova.compute.manager [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Traceback (most recent call last): [ 2659.021499] env[62277]: ERROR nova.compute.manager [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2659.021499] env[62277]: ERROR nova.compute.manager [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] self.driver.spawn(context, instance, image_meta, [ 2659.021499] env[62277]: ERROR nova.compute.manager [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2659.021499] env[62277]: ERROR nova.compute.manager [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2659.021499] env[62277]: ERROR nova.compute.manager [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2659.021499] env[62277]: ERROR nova.compute.manager [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] self._fetch_image_if_missing(context, vi) [ 2659.021499] env[62277]: ERROR nova.compute.manager [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2659.021499] env[62277]: ERROR nova.compute.manager [instance: 
6737c3b9-d9e6-4879-a6df-46d3c7dee40e] image_cache(vi, tmp_image_ds_loc) [ 2659.021499] env[62277]: ERROR nova.compute.manager [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2659.021499] env[62277]: ERROR nova.compute.manager [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] vm_util.copy_virtual_disk( [ 2659.021499] env[62277]: ERROR nova.compute.manager [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2659.021499] env[62277]: ERROR nova.compute.manager [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] session._wait_for_task(vmdk_copy_task) [ 2659.021499] env[62277]: ERROR nova.compute.manager [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2659.021499] env[62277]: ERROR nova.compute.manager [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] return self.wait_for_task(task_ref) [ 2659.021499] env[62277]: ERROR nova.compute.manager [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2659.021499] env[62277]: ERROR nova.compute.manager [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] return evt.wait() [ 2659.021499] env[62277]: ERROR nova.compute.manager [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2659.021499] env[62277]: ERROR nova.compute.manager [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] result = hub.switch() [ 2659.021499] env[62277]: ERROR nova.compute.manager [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2659.021499] env[62277]: ERROR nova.compute.manager [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] return self.greenlet.switch() [ 2659.021499] env[62277]: ERROR nova.compute.manager [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2659.021499] env[62277]: ERROR nova.compute.manager [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] self.f(*self.args, **self.kw) [ 2659.021499] env[62277]: ERROR nova.compute.manager [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2659.021499] env[62277]: ERROR nova.compute.manager [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] raise exceptions.translate_fault(task_info.error) [ 2659.021499] env[62277]: ERROR nova.compute.manager [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2659.021499] env[62277]: ERROR nova.compute.manager [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Faults: ['InvalidArgument'] [ 2659.021499] env[62277]: ERROR nova.compute.manager [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] [ 2659.022240] env[62277]: DEBUG nova.compute.utils [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] VimFaultException {{(pid=62277) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2659.023579] env[62277]: DEBUG 
nova.compute.manager [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Build of instance 6737c3b9-d9e6-4879-a6df-46d3c7dee40e was re-scheduled: A specified parameter was not correct: fileType [ 2659.023579] env[62277]: Faults: ['InvalidArgument'] {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2659.023945] env[62277]: DEBUG nova.compute.manager [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Unplugging VIFs for instance {{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2659.024130] env[62277]: DEBUG nova.compute.manager [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2659.024304] env[62277]: DEBUG nova.compute.manager [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2659.024463] env[62277]: DEBUG nova.network.neutron [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2659.374840] env[62277]: DEBUG nova.network.neutron [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2659.385525] env[62277]: INFO nova.compute.manager [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Took 0.36 seconds to deallocate network for instance. 
[ 2659.472724] env[62277]: INFO nova.scheduler.client.report [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Deleted allocations for instance 6737c3b9-d9e6-4879-a6df-46d3c7dee40e [ 2659.488418] env[62277]: DEBUG oslo_concurrency.lockutils [None req-70db0d9a-5bea-4a51-848d-839cf8e4bfec tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Lock "6737c3b9-d9e6-4879-a6df-46d3c7dee40e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 614.002s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2659.488682] env[62277]: DEBUG oslo_concurrency.lockutils [None req-3c48291b-c414-44de-89fc-af67fcfe25e8 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Lock "6737c3b9-d9e6-4879-a6df-46d3c7dee40e" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 417.815s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2659.488901] env[62277]: DEBUG oslo_concurrency.lockutils [None req-3c48291b-c414-44de-89fc-af67fcfe25e8 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Acquiring lock "6737c3b9-d9e6-4879-a6df-46d3c7dee40e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2659.489133] env[62277]: DEBUG oslo_concurrency.lockutils [None req-3c48291b-c414-44de-89fc-af67fcfe25e8 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Lock "6737c3b9-d9e6-4879-a6df-46d3c7dee40e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2659.489300] env[62277]: DEBUG oslo_concurrency.lockutils [None req-3c48291b-c414-44de-89fc-af67fcfe25e8 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Lock "6737c3b9-d9e6-4879-a6df-46d3c7dee40e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2659.491379] env[62277]: INFO nova.compute.manager [None req-3c48291b-c414-44de-89fc-af67fcfe25e8 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Terminating instance [ 2659.493321] env[62277]: DEBUG nova.compute.manager [None req-3c48291b-c414-44de-89fc-af67fcfe25e8 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Start destroying the instance on the hypervisor. 
{{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2659.493455] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-3c48291b-c414-44de-89fc-af67fcfe25e8 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2659.494025] env[62277]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-61e736fd-df69-429d-8adb-04a188c301f7 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2659.502557] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c06c3fcf-8cb5-4f83-8ab6-e7dbb6a1096f {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2659.527878] env[62277]: WARNING nova.virt.vmwareapi.vmops [None req-3c48291b-c414-44de-89fc-af67fcfe25e8 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 6737c3b9-d9e6-4879-a6df-46d3c7dee40e could not be found. [ 2659.528150] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-3c48291b-c414-44de-89fc-af67fcfe25e8 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2659.528339] env[62277]: INFO nova.compute.manager [None req-3c48291b-c414-44de-89fc-af67fcfe25e8 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Took 0.03 seconds to destroy the instance on the hypervisor. [ 2659.528581] env[62277]: DEBUG oslo.service.loopingcall [None req-3c48291b-c414-44de-89fc-af67fcfe25e8 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2659.528797] env[62277]: DEBUG nova.compute.manager [-] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2659.528894] env[62277]: DEBUG nova.network.neutron [-] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2659.551119] env[62277]: DEBUG nova.network.neutron [-] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2659.559044] env[62277]: INFO nova.compute.manager [-] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] Took 0.03 seconds to deallocate network for instance. 
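The "Acquiring lock", "acquired ... :: waited", and "released ... :: held" records throughout this stretch come from oslo.concurrency's lockutils wrapper, which serializes work per instance UUID and reports how long each caller waited for and then held the named semaphore. A minimal sketch of the same pattern, assuming only the public lockutils.synchronized decorator and lockutils.lock context manager (the function body is a placeholder, not the real terminate path):

```python
from oslo_concurrency import lockutils

# Lock name taken from the records above; any string works as a lock name.
INSTANCE_LOCK = '6737c3b9-d9e6-4879-a6df-46d3c7dee40e'


@lockutils.synchronized(INSTANCE_LOCK)
def do_terminate_instance():
    # Runs with the named semaphore held; a second caller blocks here and,
    # once it gets in, lockutils logs the "acquired ... :: waited Ns" and
    # later "released ... :: held Ns" lines seen above.
    pass


# The same named lock can also be taken explicitly as a context manager:
with lockutils.lock(INSTANCE_LOCK):
    pass  # guarded section (placeholder)
```

Read this way, the 614.002s hold and 417.815s wait logged above just mean the build attempt kept the per-instance lock for its whole retry window while the delete request queued behind it on the same semaphore.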
[ 2659.660625] env[62277]: DEBUG oslo_concurrency.lockutils [None req-3c48291b-c414-44de-89fc-af67fcfe25e8 tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Lock "6737c3b9-d9e6-4879-a6df-46d3c7dee40e" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.172s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2659.661445] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "6737c3b9-d9e6-4879-a6df-46d3c7dee40e" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 177.544s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2659.661627] env[62277]: INFO nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 6737c3b9-d9e6-4879-a6df-46d3c7dee40e] During sync_power_state the instance has a pending task (deleting). Skip. [ 2659.661792] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "6737c3b9-d9e6-4879-a6df-46d3c7dee40e" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2672.169177] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2678.163904] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2679.824105] env[62277]: DEBUG oslo_concurrency.lockutils [None req-8857ebee-692c-4d19-81d7-638db29ea98d tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Acquiring lock "98c0b8a1-7f1d-4b48-b855-97abc6e015a5" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2679.824420] env[62277]: DEBUG oslo_concurrency.lockutils [None req-8857ebee-692c-4d19-81d7-638db29ea98d tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Lock "98c0b8a1-7f1d-4b48-b855-97abc6e015a5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2679.834941] env[62277]: DEBUG nova.compute.manager [None req-8857ebee-692c-4d19-81d7-638db29ea98d tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 98c0b8a1-7f1d-4b48-b855-97abc6e015a5] Starting instance... 
{{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 2679.881709] env[62277]: DEBUG oslo_concurrency.lockutils [None req-8857ebee-692c-4d19-81d7-638db29ea98d tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2679.882036] env[62277]: DEBUG oslo_concurrency.lockutils [None req-8857ebee-692c-4d19-81d7-638db29ea98d tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2679.884057] env[62277]: INFO nova.compute.claims [None req-8857ebee-692c-4d19-81d7-638db29ea98d tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 98c0b8a1-7f1d-4b48-b855-97abc6e015a5] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2680.007028] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c46ff07-8edd-4e94-8d87-0dee2906397c {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2680.014408] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8444804e-e6fb-4cf8-8e28-7ed70097e7ce {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2680.042683] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-84bf2c0d-2dd2-423d-84e0-9e1489343da2 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2680.049106] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-70a37f7d-df5f-43ad-9afb-9463ec141fbe {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2680.062697] env[62277]: DEBUG nova.compute.provider_tree [None req-8857ebee-692c-4d19-81d7-638db29ea98d tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2680.070867] env[62277]: DEBUG nova.scheduler.client.report [None req-8857ebee-692c-4d19-81d7-638db29ea98d tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2680.083658] env[62277]: DEBUG oslo_concurrency.lockutils 
[None req-8857ebee-692c-4d19-81d7-638db29ea98d tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.202s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2680.084104] env[62277]: DEBUG nova.compute.manager [None req-8857ebee-692c-4d19-81d7-638db29ea98d tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 98c0b8a1-7f1d-4b48-b855-97abc6e015a5] Start building networks asynchronously for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 2680.116396] env[62277]: DEBUG nova.compute.utils [None req-8857ebee-692c-4d19-81d7-638db29ea98d tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Using /dev/sd instead of None {{(pid=62277) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2680.117537] env[62277]: DEBUG nova.compute.manager [None req-8857ebee-692c-4d19-81d7-638db29ea98d tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 98c0b8a1-7f1d-4b48-b855-97abc6e015a5] Allocating IP information in the background. {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 2680.117714] env[62277]: DEBUG nova.network.neutron [None req-8857ebee-692c-4d19-81d7-638db29ea98d tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 98c0b8a1-7f1d-4b48-b855-97abc6e015a5] allocate_for_instance() {{(pid=62277) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2680.126049] env[62277]: DEBUG nova.compute.manager [None req-8857ebee-692c-4d19-81d7-638db29ea98d tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 98c0b8a1-7f1d-4b48-b855-97abc6e015a5] Start building block device mappings for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 2680.168587] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2680.187477] env[62277]: DEBUG nova.compute.manager [None req-8857ebee-692c-4d19-81d7-638db29ea98d tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 98c0b8a1-7f1d-4b48-b855-97abc6e015a5] Start spawning the instance on the hypervisor. 
{{(pid=62277) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 2680.213421] env[62277]: DEBUG nova.virt.hardware [None req-8857ebee-692c-4d19-81d7-638db29ea98d tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-11-06T22:11:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-11-06T22:11:15Z,direct_url=,disk_format='vmdk',id=6f125163-af69-40e9-92ae-3b8a01d74b60,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='63ed6630e9c140baa826f53d7a0564d1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-11-06T22:11:16Z,virtual_size=,visibility=), allow threads: False {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2680.213640] env[62277]: DEBUG nova.virt.hardware [None req-8857ebee-692c-4d19-81d7-638db29ea98d tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Flavor limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2680.213796] env[62277]: DEBUG nova.virt.hardware [None req-8857ebee-692c-4d19-81d7-638db29ea98d tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Image limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2680.214141] env[62277]: DEBUG nova.virt.hardware [None req-8857ebee-692c-4d19-81d7-638db29ea98d tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Flavor pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2680.214976] env[62277]: DEBUG nova.virt.hardware [None req-8857ebee-692c-4d19-81d7-638db29ea98d tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Image pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2680.214976] env[62277]: DEBUG nova.virt.hardware [None req-8857ebee-692c-4d19-81d7-638db29ea98d tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2680.214976] env[62277]: DEBUG nova.virt.hardware [None req-8857ebee-692c-4d19-81d7-638db29ea98d tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2680.214976] env[62277]: DEBUG nova.virt.hardware [None req-8857ebee-692c-4d19-81d7-638db29ea98d tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2680.215181] env[62277]: DEBUG 
nova.virt.hardware [None req-8857ebee-692c-4d19-81d7-638db29ea98d tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Got 1 possible topologies {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2680.215181] env[62277]: DEBUG nova.virt.hardware [None req-8857ebee-692c-4d19-81d7-638db29ea98d tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2680.215364] env[62277]: DEBUG nova.virt.hardware [None req-8857ebee-692c-4d19-81d7-638db29ea98d tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2680.216224] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c5ed1630-f07e-4177-9321-af2ec251fbc8 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2680.224538] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ce946e3-b2d5-493e-8a75-7df248d04e49 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2680.248595] env[62277]: DEBUG nova.policy [None req-8857ebee-692c-4d19-81d7-638db29ea98d tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '013359a6ab0644799bb338125a970c37', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '47f21dc2b2ad4fe692324779a4a84760', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62277) authorize /opt/stack/nova/nova/policy.py:203}} [ 2680.533739] env[62277]: DEBUG nova.network.neutron [None req-8857ebee-692c-4d19-81d7-638db29ea98d tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 98c0b8a1-7f1d-4b48-b855-97abc6e015a5] Successfully created port: d84d8d5d-ae02-47f1-be0b-7334324781b0 {{(pid=62277) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2681.234110] env[62277]: DEBUG nova.compute.manager [req-64c9d6dc-e0d6-4c3b-9dd6-b6fe24a57298 req-e0e13085-5654-4c46-bda4-6771668a3c06 service nova] [instance: 98c0b8a1-7f1d-4b48-b855-97abc6e015a5] Received event network-vif-plugged-d84d8d5d-ae02-47f1-be0b-7334324781b0 {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 2681.234358] env[62277]: DEBUG oslo_concurrency.lockutils [req-64c9d6dc-e0d6-4c3b-9dd6-b6fe24a57298 req-e0e13085-5654-4c46-bda4-6771668a3c06 service nova] Acquiring lock "98c0b8a1-7f1d-4b48-b855-97abc6e015a5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2681.234547] env[62277]: DEBUG oslo_concurrency.lockutils [req-64c9d6dc-e0d6-4c3b-9dd6-b6fe24a57298 
req-e0e13085-5654-4c46-bda4-6771668a3c06 service nova] Lock "98c0b8a1-7f1d-4b48-b855-97abc6e015a5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2681.234716] env[62277]: DEBUG oslo_concurrency.lockutils [req-64c9d6dc-e0d6-4c3b-9dd6-b6fe24a57298 req-e0e13085-5654-4c46-bda4-6771668a3c06 service nova] Lock "98c0b8a1-7f1d-4b48-b855-97abc6e015a5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2681.234880] env[62277]: DEBUG nova.compute.manager [req-64c9d6dc-e0d6-4c3b-9dd6-b6fe24a57298 req-e0e13085-5654-4c46-bda4-6771668a3c06 service nova] [instance: 98c0b8a1-7f1d-4b48-b855-97abc6e015a5] No waiting events found dispatching network-vif-plugged-d84d8d5d-ae02-47f1-be0b-7334324781b0 {{(pid=62277) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2681.235100] env[62277]: WARNING nova.compute.manager [req-64c9d6dc-e0d6-4c3b-9dd6-b6fe24a57298 req-e0e13085-5654-4c46-bda4-6771668a3c06 service nova] [instance: 98c0b8a1-7f1d-4b48-b855-97abc6e015a5] Received unexpected event network-vif-plugged-d84d8d5d-ae02-47f1-be0b-7334324781b0 for instance with vm_state building and task_state spawning. [ 2681.330157] env[62277]: DEBUG nova.network.neutron [None req-8857ebee-692c-4d19-81d7-638db29ea98d tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 98c0b8a1-7f1d-4b48-b855-97abc6e015a5] Successfully updated port: d84d8d5d-ae02-47f1-be0b-7334324781b0 {{(pid=62277) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2681.349450] env[62277]: DEBUG oslo_concurrency.lockutils [None req-8857ebee-692c-4d19-81d7-638db29ea98d tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Acquiring lock "refresh_cache-98c0b8a1-7f1d-4b48-b855-97abc6e015a5" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2681.349611] env[62277]: DEBUG oslo_concurrency.lockutils [None req-8857ebee-692c-4d19-81d7-638db29ea98d tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Acquired lock "refresh_cache-98c0b8a1-7f1d-4b48-b855-97abc6e015a5" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2681.349763] env[62277]: DEBUG nova.network.neutron [None req-8857ebee-692c-4d19-81d7-638db29ea98d tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 98c0b8a1-7f1d-4b48-b855-97abc6e015a5] Building network info cache for instance {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2681.406888] env[62277]: DEBUG nova.network.neutron [None req-8857ebee-692c-4d19-81d7-638db29ea98d tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 98c0b8a1-7f1d-4b48-b855-97abc6e015a5] Instance cache missing network info. 
{{(pid=62277) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2681.556551] env[62277]: DEBUG nova.network.neutron [None req-8857ebee-692c-4d19-81d7-638db29ea98d tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 98c0b8a1-7f1d-4b48-b855-97abc6e015a5] Updating instance_info_cache with network_info: [{"id": "d84d8d5d-ae02-47f1-be0b-7334324781b0", "address": "fa:16:3e:60:cd:45", "network": {"id": "0fdd7696-a8f7-4fbe-88a6-4c5254049911", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-706519722-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "47f21dc2b2ad4fe692324779a4a84760", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7150f662-0cf1-44f9-ae14-d70f479649b6", "external-id": "nsx-vlan-transportzone-712", "segmentation_id": 712, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd84d8d5d-ae", "ovs_interfaceid": "d84d8d5d-ae02-47f1-be0b-7334324781b0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2681.567783] env[62277]: DEBUG oslo_concurrency.lockutils [None req-8857ebee-692c-4d19-81d7-638db29ea98d tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Releasing lock "refresh_cache-98c0b8a1-7f1d-4b48-b855-97abc6e015a5" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2681.568083] env[62277]: DEBUG nova.compute.manager [None req-8857ebee-692c-4d19-81d7-638db29ea98d tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 98c0b8a1-7f1d-4b48-b855-97abc6e015a5] Instance network_info: |[{"id": "d84d8d5d-ae02-47f1-be0b-7334324781b0", "address": "fa:16:3e:60:cd:45", "network": {"id": "0fdd7696-a8f7-4fbe-88a6-4c5254049911", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-706519722-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "47f21dc2b2ad4fe692324779a4a84760", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7150f662-0cf1-44f9-ae14-d70f479649b6", "external-id": "nsx-vlan-transportzone-712", "segmentation_id": 712, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd84d8d5d-ae", "ovs_interfaceid": "d84d8d5d-ae02-47f1-be0b-7334324781b0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62277) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 2681.568503] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-8857ebee-692c-4d19-81d7-638db29ea98d tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 98c0b8a1-7f1d-4b48-b855-97abc6e015a5] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:60:cd:45', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '7150f662-0cf1-44f9-ae14-d70f479649b6', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'd84d8d5d-ae02-47f1-be0b-7334324781b0', 'vif_model': 'vmxnet3'}] {{(pid=62277) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2681.575986] env[62277]: DEBUG oslo.service.loopingcall [None req-8857ebee-692c-4d19-81d7-638db29ea98d tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2681.576459] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 98c0b8a1-7f1d-4b48-b855-97abc6e015a5] Creating VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2681.576707] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-12b11d9e-f91e-4acd-86a8-1c5a4451d09a {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2681.596810] env[62277]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2681.596810] env[62277]: value = "task-1405535" [ 2681.596810] env[62277]: _type = "Task" [ 2681.596810] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2681.604408] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405535, 'name': CreateVM_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2682.108998] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405535, 'name': CreateVM_Task, 'duration_secs': 0.300348} completed successfully. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2682.109207] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 98c0b8a1-7f1d-4b48-b855-97abc6e015a5] Created VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2682.109877] env[62277]: DEBUG oslo_concurrency.lockutils [None req-8857ebee-692c-4d19-81d7-638db29ea98d tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2682.110046] env[62277]: DEBUG oslo_concurrency.lockutils [None req-8857ebee-692c-4d19-81d7-638db29ea98d tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2682.110389] env[62277]: DEBUG oslo_concurrency.lockutils [None req-8857ebee-692c-4d19-81d7-638db29ea98d tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2682.110637] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b8b66b91-9c27-493d-807c-cb22ac2d60fe {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2682.114933] env[62277]: DEBUG oslo_vmware.api [None req-8857ebee-692c-4d19-81d7-638db29ea98d tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Waiting for the task: (returnval){ [ 2682.114933] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]522b4738-2a05-07df-141d-d122ccbe90bb" [ 2682.114933] env[62277]: _type = "Task" [ 2682.114933] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2682.122305] env[62277]: DEBUG oslo_vmware.api [None req-8857ebee-692c-4d19-81d7-638db29ea98d tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]522b4738-2a05-07df-141d-d122ccbe90bb, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2682.625311] env[62277]: DEBUG oslo_concurrency.lockutils [None req-8857ebee-692c-4d19-81d7-638db29ea98d tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2682.625675] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-8857ebee-692c-4d19-81d7-638db29ea98d tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 98c0b8a1-7f1d-4b48-b855-97abc6e015a5] Processing image 6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2682.625757] env[62277]: DEBUG oslo_concurrency.lockutils [None req-8857ebee-692c-4d19-81d7-638db29ea98d tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2683.168988] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2683.169258] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Starting heal instance info cache {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9909}} [ 2683.169323] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Rebuilding the list of instances to heal {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9913}} [ 2683.185315] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2683.185460] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2683.185595] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2683.185719] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 5683f242-4848-42fa-9353-46982c3a72c0] Skipping network cache update for instance because it is Building. 
{{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2683.185842] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 6555137f-42d8-4e07-8b1f-b1e431d082ad] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2683.185962] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 98c0b8a1-7f1d-4b48-b855-97abc6e015a5] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2683.186092] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Didn't find any instances for network info cache update. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9995}} [ 2683.186528] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2683.186714] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager.update_available_resource {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2683.196922] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2683.197129] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2683.197306] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2683.197464] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62277) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2683.198508] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-139b8b47-a1e5-4896-86ba-1a1469ae068b {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2683.206961] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c75afc69-cc13-44d6-9b42-ef4e66393249 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2683.220379] env[62277]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a697c550-fd70-4bbb-9404-c700d5a3f06a {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2683.226246] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-58717ef5-5261-4141-ae33-14d662b02558 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2683.254341] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181432MB free_disk=184GB free_vcpus=48 pci_devices=None {{(pid=62277) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2683.254499] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2683.254693] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2683.263637] env[62277]: DEBUG nova.compute.manager [req-de4d5997-c773-4c02-b5da-8751bd7294fa req-92ad0e50-dca5-43c2-ad97-69ea1e3060e6 service nova] [instance: 98c0b8a1-7f1d-4b48-b855-97abc6e015a5] Received event network-changed-d84d8d5d-ae02-47f1-be0b-7334324781b0 {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 2683.263869] env[62277]: DEBUG nova.compute.manager [req-de4d5997-c773-4c02-b5da-8751bd7294fa req-92ad0e50-dca5-43c2-ad97-69ea1e3060e6 service nova] [instance: 98c0b8a1-7f1d-4b48-b855-97abc6e015a5] Refreshing instance network info cache due to event network-changed-d84d8d5d-ae02-47f1-be0b-7334324781b0. 
{{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11104}} [ 2683.264046] env[62277]: DEBUG oslo_concurrency.lockutils [req-de4d5997-c773-4c02-b5da-8751bd7294fa req-92ad0e50-dca5-43c2-ad97-69ea1e3060e6 service nova] Acquiring lock "refresh_cache-98c0b8a1-7f1d-4b48-b855-97abc6e015a5" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2683.264196] env[62277]: DEBUG oslo_concurrency.lockutils [req-de4d5997-c773-4c02-b5da-8751bd7294fa req-92ad0e50-dca5-43c2-ad97-69ea1e3060e6 service nova] Acquired lock "refresh_cache-98c0b8a1-7f1d-4b48-b855-97abc6e015a5" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2683.264388] env[62277]: DEBUG nova.network.neutron [req-de4d5997-c773-4c02-b5da-8751bd7294fa req-92ad0e50-dca5-43c2-ad97-69ea1e3060e6 service nova] [instance: 98c0b8a1-7f1d-4b48-b855-97abc6e015a5] Refreshing network info cache for port d84d8d5d-ae02-47f1-be0b-7334324781b0 {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 2683.313702] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 4f26ed27-558d-489a-9141-ec63b6164cc8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2683.313867] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 26a7549d-94b4-4113-ab8b-10886eafcd49 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2683.313996] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 295cb2fd-b409-4d5c-8fef-12b7acd9fec0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2683.314129] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 5683f242-4848-42fa-9353-46982c3a72c0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2683.314246] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 6555137f-42d8-4e07-8b1f-b1e431d082ad actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2683.314361] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 98c0b8a1-7f1d-4b48-b855-97abc6e015a5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2683.314537] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Total usable vcpus: 48, total allocated vcpus: 6 {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2683.314670] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1280MB phys_disk=200GB used_disk=6GB total_vcpus=48 used_vcpus=6 pci_stats=[] {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2683.391328] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b9cd25c-2051-4778-9e0c-8700560ec5c3 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2683.398985] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82a6d175-7a9b-4d31-8a36-5f1dae3947c9 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2683.430975] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f23a8e2-b126-4925-93c4-4bb7a31500e4 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2683.438070] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d5ac86b-75fb-400f-a47e-4ae2b015d224 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2683.451798] env[62277]: DEBUG nova.compute.provider_tree [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2683.459526] env[62277]: DEBUG nova.scheduler.client.report [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2683.475327] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62277) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2683.475505] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.221s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2683.520751] env[62277]: DEBUG nova.network.neutron 
[req-de4d5997-c773-4c02-b5da-8751bd7294fa req-92ad0e50-dca5-43c2-ad97-69ea1e3060e6 service nova] [instance: 98c0b8a1-7f1d-4b48-b855-97abc6e015a5] Updated VIF entry in instance network info cache for port d84d8d5d-ae02-47f1-be0b-7334324781b0. {{(pid=62277) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 2683.521140] env[62277]: DEBUG nova.network.neutron [req-de4d5997-c773-4c02-b5da-8751bd7294fa req-92ad0e50-dca5-43c2-ad97-69ea1e3060e6 service nova] [instance: 98c0b8a1-7f1d-4b48-b855-97abc6e015a5] Updating instance_info_cache with network_info: [{"id": "d84d8d5d-ae02-47f1-be0b-7334324781b0", "address": "fa:16:3e:60:cd:45", "network": {"id": "0fdd7696-a8f7-4fbe-88a6-4c5254049911", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-706519722-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "47f21dc2b2ad4fe692324779a4a84760", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7150f662-0cf1-44f9-ae14-d70f479649b6", "external-id": "nsx-vlan-transportzone-712", "segmentation_id": 712, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd84d8d5d-ae", "ovs_interfaceid": "d84d8d5d-ae02-47f1-be0b-7334324781b0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2683.529557] env[62277]: DEBUG oslo_concurrency.lockutils [req-de4d5997-c773-4c02-b5da-8751bd7294fa req-92ad0e50-dca5-43c2-ad97-69ea1e3060e6 service nova] Releasing lock "refresh_cache-98c0b8a1-7f1d-4b48-b855-97abc6e015a5" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2685.456898] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2686.169309] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2686.169512] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=62277) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10528}} [ 2687.169630] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2694.164626] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2707.496472] env[62277]: WARNING oslo_vmware.rw_handles [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2707.496472] env[62277]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2707.496472] env[62277]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2707.496472] env[62277]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2707.496472] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2707.496472] env[62277]: ERROR oslo_vmware.rw_handles response.begin() [ 2707.496472] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2707.496472] env[62277]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2707.496472] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2707.496472] env[62277]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2707.496472] env[62277]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2707.496472] env[62277]: ERROR oslo_vmware.rw_handles [ 2707.497126] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Downloaded image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to vmware_temp/32240725-c93a-42df-ba23-0bc0df59dfcb/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2707.498877] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Caching image {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2707.499128] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Copying Virtual Disk [datastore2] vmware_temp/32240725-c93a-42df-ba23-0bc0df59dfcb/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk to [datastore2] vmware_temp/32240725-c93a-42df-ba23-0bc0df59dfcb/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk {{(pid=62277) copy_virtual_disk 
/opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2707.499408] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-1a25045b-5daf-4563-90f8-71f05ff59a29 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2707.507044] env[62277]: DEBUG oslo_vmware.api [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Waiting for the task: (returnval){ [ 2707.507044] env[62277]: value = "task-1405536" [ 2707.507044] env[62277]: _type = "Task" [ 2707.507044] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2707.514967] env[62277]: DEBUG oslo_vmware.api [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Task: {'id': task-1405536, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2708.017589] env[62277]: DEBUG oslo_vmware.exceptions [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Fault InvalidArgument not matched. {{(pid=62277) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2708.017870] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2708.018435] env[62277]: ERROR nova.compute.manager [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2708.018435] env[62277]: Faults: ['InvalidArgument'] [ 2708.018435] env[62277]: ERROR nova.compute.manager [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Traceback (most recent call last): [ 2708.018435] env[62277]: ERROR nova.compute.manager [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2708.018435] env[62277]: ERROR nova.compute.manager [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] yield resources [ 2708.018435] env[62277]: ERROR nova.compute.manager [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2708.018435] env[62277]: ERROR nova.compute.manager [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] self.driver.spawn(context, instance, image_meta, [ 2708.018435] env[62277]: ERROR nova.compute.manager [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2708.018435] env[62277]: ERROR nova.compute.manager [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2708.018435] env[62277]: ERROR nova.compute.manager [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2708.018435] env[62277]: ERROR nova.compute.manager [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] self._fetch_image_if_missing(context, vi) [ 2708.018435] env[62277]: ERROR nova.compute.manager [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2708.018435] env[62277]: ERROR nova.compute.manager [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] image_cache(vi, tmp_image_ds_loc) [ 2708.018435] env[62277]: ERROR nova.compute.manager [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2708.018435] env[62277]: ERROR nova.compute.manager [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] vm_util.copy_virtual_disk( [ 2708.018435] env[62277]: ERROR nova.compute.manager [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2708.018435] env[62277]: ERROR nova.compute.manager [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] session._wait_for_task(vmdk_copy_task) [ 2708.018435] env[62277]: ERROR nova.compute.manager [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2708.018435] env[62277]: ERROR nova.compute.manager [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] return self.wait_for_task(task_ref) [ 2708.018435] env[62277]: ERROR nova.compute.manager [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2708.018435] env[62277]: ERROR nova.compute.manager [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] return evt.wait() [ 2708.018435] env[62277]: ERROR nova.compute.manager [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2708.018435] env[62277]: ERROR nova.compute.manager [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] result = hub.switch() [ 2708.018435] env[62277]: ERROR nova.compute.manager [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2708.018435] env[62277]: ERROR nova.compute.manager [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] return self.greenlet.switch() [ 2708.018435] env[62277]: ERROR nova.compute.manager [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2708.018435] env[62277]: ERROR nova.compute.manager [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] self.f(*self.args, **self.kw) [ 2708.018435] env[62277]: ERROR nova.compute.manager [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2708.018435] env[62277]: ERROR nova.compute.manager [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] raise exceptions.translate_fault(task_info.error) [ 2708.018435] env[62277]: ERROR nova.compute.manager [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2708.018435] env[62277]: ERROR nova.compute.manager [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Faults: ['InvalidArgument'] [ 2708.018435] 
env[62277]: ERROR nova.compute.manager [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] [ 2708.019728] env[62277]: INFO nova.compute.manager [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Terminating instance [ 2708.020298] env[62277]: DEBUG oslo_concurrency.lockutils [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2708.020502] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2708.020730] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6e173361-a2a0-4759-9cbe-520dc1602b9f {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2708.022773] env[62277]: DEBUG nova.compute.manager [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Start destroying the instance on the hypervisor. {{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2708.022962] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2708.023664] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-137062ec-9f6f-4964-b10e-2b0d9784969c {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2708.030031] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Unregistering the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2708.030244] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-8529e026-6ced-4b7f-9edc-6081243da033 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2708.032212] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2708.032380] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 
tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62277) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2708.033302] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1c46decb-731c-441a-a106-bcd8c9b687fa {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2708.037803] env[62277]: DEBUG oslo_vmware.api [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Waiting for the task: (returnval){ [ 2708.037803] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]5270560a-dca2-e635-c2e0-6e86124ba5e7" [ 2708.037803] env[62277]: _type = "Task" [ 2708.037803] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2708.045553] env[62277]: DEBUG oslo_vmware.api [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]5270560a-dca2-e635-c2e0-6e86124ba5e7, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2708.091218] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Unregistered the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2708.091439] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Deleting contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2708.091618] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Deleting the datastore file [datastore2] 4f26ed27-558d-489a-9141-ec63b6164cc8 {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2708.091881] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-bef3c5e8-9c2e-4291-9c41-24b3bf2a8f42 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2708.097654] env[62277]: DEBUG oslo_vmware.api [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Waiting for the task: (returnval){ [ 2708.097654] env[62277]: value = "task-1405538" [ 2708.097654] env[62277]: _type = "Task" [ 2708.097654] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2708.104869] env[62277]: DEBUG oslo_vmware.api [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Task: {'id': task-1405538, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2708.547481] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Preparing fetch location {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2708.547806] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Creating directory with path [datastore2] vmware_temp/ad66ac73-b499-4794-8e28-f68cd946313f/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2708.548159] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c28ef73b-6307-45ed-9351-1927cd3c1cc7 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2708.559357] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Created directory with path [datastore2] vmware_temp/ad66ac73-b499-4794-8e28-f68cd946313f/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2708.559541] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Fetch image to [datastore2] vmware_temp/ad66ac73-b499-4794-8e28-f68cd946313f/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2708.559711] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to [datastore2] vmware_temp/ad66ac73-b499-4794-8e28-f68cd946313f/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2708.560431] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d6f7951-f0ae-4fa3-b9d2-7fd785d1087e {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2708.566570] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-408dcc3e-fae7-4b77-9010-36ef1ccfb12b {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2708.575304] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ac6141e5-adbd-4214-9d84-e193db401278 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2708.609110] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-de266b74-6278-4283-8bee-8d0da0367b41 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2708.616226] env[62277]: DEBUG oslo_vmware.api [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Task: {'id': task-1405538, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.076975} completed successfully. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2708.617667] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Deleted the datastore file {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2708.617861] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Deleted contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2708.618040] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2708.618214] env[62277]: INFO nova.compute.manager [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Took 0.60 seconds to destroy the instance on the hypervisor. 
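The failed spawn above follows a fixed sequence: the image is streamed to a temporary path on datastore2, a VirtualDiskManager.CopyVirtualDisk_Task is submitted, and the task is polled until it either completes or reports a fault, at which point the fault is translated into an exception (here oslo_vmware.exceptions.VimFaultException, "A specified parameter was not correct: fileType") and the half-built instance is destroyed. The sketch below is a simplified, hypothetical model of that poll-and-translate loop, not Nova's or oslo.vmware's actual implementation; get_task_info, FAULT_CLASSES and the dict-shaped task info are illustrative stand-ins.

    import time

    class VimFaultException(Exception):
        """Simplified stand-in for oslo_vmware.exceptions.VimFaultException."""
        def __init__(self, fault_list, message):
            super().__init__(message)
            self.fault_list = fault_list

    # Hypothetical registry of specific fault classes; an unmatched fault name
    # falls back to the generic exception, which is what the
    # "Fault InvalidArgument not matched" debug line above corresponds to.
    FAULT_CLASSES = {}

    def translate_fault(error):
        cls = FAULT_CLASSES.get(error["fault_name"])
        if cls is None:
            return VimFaultException([error["fault_name"]], error["message"])
        return cls(error["message"])

    def wait_for_task(get_task_info, task_ref, poll_interval=0.5):
        """Poll a vCenter task until success or error (illustrative sketch)."""
        while True:
            info = get_task_info(task_ref)      # progress 0% ... done / error
            if info["state"] == "success":
                return info.get("result")
            if info["state"] == "error":
                raise translate_fault(info["error"])
            time.sleep(poll_interval)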
[ 2708.619918] env[62277]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-fa8d47c6-7567-48f2-9260-d2a7285e7608 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2708.621920] env[62277]: DEBUG nova.compute.claims [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Aborting claim: {{(pid=62277) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 2708.622105] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2708.622315] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2708.641634] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2708.692297] env[62277]: DEBUG oslo_vmware.rw_handles [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ad66ac73-b499-4794-8e28-f68cd946313f/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62277) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2708.751818] env[62277]: DEBUG oslo_vmware.rw_handles [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Completed reading data from the image iterator. {{(pid=62277) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2708.752019] env[62277]: DEBUG oslo_vmware.rw_handles [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ad66ac73-b499-4794-8e28-f68cd946313f/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=62277) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2708.790194] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-65cbf212-da33-448b-9fbf-52d11197d172 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2708.797705] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c2c0d5e-de0a-4df0-9d2e-338bc67786ba {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2708.827228] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f3db1dae-3cd3-47e4-b169-4e7795d45bfd {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2708.835230] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da996ba5-5b87-4b17-87a3-e25a33bec95f {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2708.847733] env[62277]: DEBUG nova.compute.provider_tree [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2708.855856] env[62277]: DEBUG nova.scheduler.client.report [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2708.868763] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.246s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2708.869247] env[62277]: ERROR nova.compute.manager [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2708.869247] env[62277]: Faults: ['InvalidArgument'] [ 2708.869247] env[62277]: ERROR nova.compute.manager [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Traceback (most recent call last): [ 2708.869247] env[62277]: ERROR nova.compute.manager [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2708.869247] env[62277]: ERROR nova.compute.manager [instance: 
4f26ed27-558d-489a-9141-ec63b6164cc8] self.driver.spawn(context, instance, image_meta, [ 2708.869247] env[62277]: ERROR nova.compute.manager [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2708.869247] env[62277]: ERROR nova.compute.manager [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2708.869247] env[62277]: ERROR nova.compute.manager [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2708.869247] env[62277]: ERROR nova.compute.manager [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] self._fetch_image_if_missing(context, vi) [ 2708.869247] env[62277]: ERROR nova.compute.manager [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2708.869247] env[62277]: ERROR nova.compute.manager [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] image_cache(vi, tmp_image_ds_loc) [ 2708.869247] env[62277]: ERROR nova.compute.manager [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2708.869247] env[62277]: ERROR nova.compute.manager [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] vm_util.copy_virtual_disk( [ 2708.869247] env[62277]: ERROR nova.compute.manager [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2708.869247] env[62277]: ERROR nova.compute.manager [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] session._wait_for_task(vmdk_copy_task) [ 2708.869247] env[62277]: ERROR nova.compute.manager [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2708.869247] env[62277]: ERROR nova.compute.manager [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] return self.wait_for_task(task_ref) [ 2708.869247] env[62277]: ERROR nova.compute.manager [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2708.869247] env[62277]: ERROR nova.compute.manager [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] return evt.wait() [ 2708.869247] env[62277]: ERROR nova.compute.manager [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2708.869247] env[62277]: ERROR nova.compute.manager [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] result = hub.switch() [ 2708.869247] env[62277]: ERROR nova.compute.manager [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2708.869247] env[62277]: ERROR nova.compute.manager [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] return self.greenlet.switch() [ 2708.869247] env[62277]: ERROR nova.compute.manager [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2708.869247] env[62277]: ERROR nova.compute.manager [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] self.f(*self.args, **self.kw) [ 2708.869247] env[62277]: ERROR nova.compute.manager [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2708.869247] env[62277]: ERROR nova.compute.manager [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] raise exceptions.translate_fault(task_info.error) [ 2708.869247] env[62277]: ERROR nova.compute.manager [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2708.869247] env[62277]: ERROR nova.compute.manager [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Faults: ['InvalidArgument'] [ 2708.869247] env[62277]: ERROR nova.compute.manager [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] [ 2708.870106] env[62277]: DEBUG nova.compute.utils [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] VimFaultException {{(pid=62277) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2708.871221] env[62277]: DEBUG nova.compute.manager [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Build of instance 4f26ed27-558d-489a-9141-ec63b6164cc8 was re-scheduled: A specified parameter was not correct: fileType [ 2708.871221] env[62277]: Faults: ['InvalidArgument'] {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2708.871586] env[62277]: DEBUG nova.compute.manager [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Unplugging VIFs for instance {{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2708.871755] env[62277]: DEBUG nova.compute.manager [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2708.871919] env[62277]: DEBUG nova.compute.manager [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2708.872091] env[62277]: DEBUG nova.network.neutron [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2709.342066] env[62277]: DEBUG nova.network.neutron [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2709.352048] env[62277]: INFO nova.compute.manager [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Took 0.48 seconds to deallocate network for instance. [ 2709.440429] env[62277]: INFO nova.scheduler.client.report [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Deleted allocations for instance 4f26ed27-558d-489a-9141-ec63b6164cc8 [ 2709.462304] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7f720ecc-8b47-40a7-8f35-fa3fa6dfc645 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Lock "4f26ed27-558d-489a-9141-ec63b6164cc8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 508.642s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2709.462553] env[62277]: DEBUG oslo_concurrency.lockutils [None req-97cb2248-d5b7-4933-b0e6-6c69a74f9c0c tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Lock "4f26ed27-558d-489a-9141-ec63b6164cc8" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 312.752s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2709.462770] env[62277]: DEBUG oslo_concurrency.lockutils [None req-97cb2248-d5b7-4933-b0e6-6c69a74f9c0c tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Acquiring lock "4f26ed27-558d-489a-9141-ec63b6164cc8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2709.462971] env[62277]: DEBUG oslo_concurrency.lockutils [None req-97cb2248-d5b7-4933-b0e6-6c69a74f9c0c tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Lock "4f26ed27-558d-489a-9141-ec63b6164cc8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 
2709.463151] env[62277]: DEBUG oslo_concurrency.lockutils [None req-97cb2248-d5b7-4933-b0e6-6c69a74f9c0c tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Lock "4f26ed27-558d-489a-9141-ec63b6164cc8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2709.465034] env[62277]: INFO nova.compute.manager [None req-97cb2248-d5b7-4933-b0e6-6c69a74f9c0c tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Terminating instance [ 2709.466779] env[62277]: DEBUG nova.compute.manager [None req-97cb2248-d5b7-4933-b0e6-6c69a74f9c0c tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Start destroying the instance on the hypervisor. {{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2709.466941] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-97cb2248-d5b7-4933-b0e6-6c69a74f9c0c tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2709.467407] env[62277]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-e2a5bd13-e27f-46b0-bd33-773d7c33a867 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2709.476440] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-38c58497-09d2-4d55-a777-35ff76f4da7c {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2709.502448] env[62277]: WARNING nova.virt.vmwareapi.vmops [None req-97cb2248-d5b7-4933-b0e6-6c69a74f9c0c tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 4f26ed27-558d-489a-9141-ec63b6164cc8 could not be found. [ 2709.502640] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-97cb2248-d5b7-4933-b0e6-6c69a74f9c0c tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2709.502811] env[62277]: INFO nova.compute.manager [None req-97cb2248-d5b7-4933-b0e6-6c69a74f9c0c tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2709.503065] env[62277]: DEBUG oslo.service.loopingcall [None req-97cb2248-d5b7-4933-b0e6-6c69a74f9c0c tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2709.503519] env[62277]: DEBUG nova.compute.manager [-] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2709.503616] env[62277]: DEBUG nova.network.neutron [-] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2709.532581] env[62277]: DEBUG nova.network.neutron [-] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2709.540701] env[62277]: INFO nova.compute.manager [-] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] Took 0.04 seconds to deallocate network for instance. [ 2709.623626] env[62277]: DEBUG oslo_concurrency.lockutils [None req-97cb2248-d5b7-4933-b0e6-6c69a74f9c0c tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Lock "4f26ed27-558d-489a-9141-ec63b6164cc8" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.161s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2709.624784] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "4f26ed27-558d-489a-9141-ec63b6164cc8" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 227.507s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2709.624870] env[62277]: INFO nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 4f26ed27-558d-489a-9141-ec63b6164cc8] During sync_power_state the instance has a pending task (deleting). Skip. 
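The lock bookkeeping above explains the long wait times: the failed build held the per-instance lock "4f26ed27-558d-489a-9141-ec63b6164cc8" for 508.642s, so do_terminate_instance could only proceed after waiting 312.752s for it, and the later power-state sync waited 227.507s; event handling takes a separate "<uuid>-events" lock. A minimal sketch of that serialization pattern with oslo.concurrency is shown below, assuming the lockutils.lock() context manager; the function names and callbacks are illustrative, not Nova's actual code.

    from oslo_concurrency import lockutils

    # Illustrative only: operations that touch the same instance serialize on a
    # lock named after the instance UUID, mirroring the lockutils lines above.
    def locked_do_build_and_run_instance(instance_uuid, build):
        with lockutils.lock(instance_uuid):
            build()                               # may hold the lock for minutes

    def do_terminate_instance(instance_uuid, destroy, clear_events):
        with lockutils.lock(instance_uuid):       # waits until the build releases
            with lockutils.lock(instance_uuid + "-events"):
                clear_events()                    # "<uuid>-events" bookkeeping
            destroy()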
[ 2709.625182] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "4f26ed27-558d-489a-9141-ec63b6164cc8" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2732.170928] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2736.169276] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2736.169608] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Cleaning up deleted instances with incomplete migration {{(pid=62277) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11234}} [ 2737.177089] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2737.177401] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Cleaning up deleted instances {{(pid=62277) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11196}} [ 2737.186497] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] There are 0 instances to clean {{(pid=62277) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11205}} [ 2740.172694] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2742.168730] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2744.168208] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2744.168533] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Starting heal instance info cache {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9909}} [ 2744.168533] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Rebuilding the list of instances to heal {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9913}} [ 2744.185195] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Skipping 
network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2744.185351] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2744.185486] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 5683f242-4848-42fa-9353-46982c3a72c0] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2744.185614] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 6555137f-42d8-4e07-8b1f-b1e431d082ad] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2744.185740] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 98c0b8a1-7f1d-4b48-b855-97abc6e015a5] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2744.185887] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Didn't find any instances for network info cache update. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9995}} [ 2744.186371] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager.update_available_resource {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2744.196826] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2744.197053] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2744.197223] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2744.197372] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62277) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2744.198489] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b0b95da9-a1a9-4651-8ee9-72ebb71d87d2 {{(pid=62277) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2744.208103] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f800d8d-f015-4a5c-9378-4485602564fd {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2744.221672] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5948a7a7-74b1-4390-8833-8e8cca63b9a6 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2744.227597] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4b9fd16d-12f0-4bf0-943f-a59890ad9aa1 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2744.255700] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181429MB free_disk=184GB free_vcpus=48 pci_devices=None {{(pid=62277) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2744.255861] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2744.256054] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2744.334781] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 26a7549d-94b4-4113-ab8b-10886eafcd49 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2744.334949] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 295cb2fd-b409-4d5c-8fef-12b7acd9fec0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2744.335095] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 5683f242-4848-42fa-9353-46982c3a72c0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2744.335218] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 6555137f-42d8-4e07-8b1f-b1e431d082ad actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2744.335337] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 98c0b8a1-7f1d-4b48-b855-97abc6e015a5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2744.335527] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Total usable vcpus: 48, total allocated vcpus: 5 {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2744.335667] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1152MB phys_disk=200GB used_disk=5GB total_vcpus=48 used_vcpus=5 pci_stats=[] {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2744.350953] env[62277]: DEBUG nova.scheduler.client.report [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Refreshing inventories for resource provider 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 2744.365454] env[62277]: DEBUG nova.scheduler.client.report [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Updating ProviderTree inventory for provider 75e125ea-a599-4b65-b9cd-6ea881735292 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 2744.365628] env[62277]: DEBUG nova.compute.provider_tree [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Updating inventory in ProviderTree for provider 75e125ea-a599-4b65-b9cd-6ea881735292 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 2744.375687] env[62277]: DEBUG nova.scheduler.client.report [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Refreshing aggregate associations for resource provider 75e125ea-a599-4b65-b9cd-6ea881735292, aggregates: None {{(pid=62277) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 2744.395389] env[62277]: DEBUG nova.scheduler.client.report [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Refreshing trait associations for resource provider 75e125ea-a599-4b65-b9cd-6ea881735292, traits: COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_VMDK {{(pid=62277) _refresh_associations 
/opt/stack/nova/nova/scheduler/client/report.py:825}} [ 2744.460122] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-59a22120-28ef-4381-8b50-8996b9fec173 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2744.467609] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b7845fa7-9c09-4287-bae6-185890333647 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2744.498041] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7b2249a7-c818-4c13-a843-17799431f1d3 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2744.504779] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-df473efc-f9a8-4dce-ba53-44e3be1b2a2e {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2744.517489] env[62277]: DEBUG nova.compute.provider_tree [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2744.525188] env[62277]: DEBUG nova.scheduler.client.report [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2744.538244] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62277) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2744.538412] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.282s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2745.520498] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2746.168475] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2747.168786] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62277) 
run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2748.168895] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2748.169242] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=62277) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10528}} [ 2758.738058] env[62277]: WARNING oslo_vmware.rw_handles [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2758.738058] env[62277]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2758.738058] env[62277]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2758.738058] env[62277]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2758.738058] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2758.738058] env[62277]: ERROR oslo_vmware.rw_handles response.begin() [ 2758.738058] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2758.738058] env[62277]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2758.738058] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2758.738058] env[62277]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2758.738058] env[62277]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2758.738058] env[62277]: ERROR oslo_vmware.rw_handles [ 2758.738058] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Downloaded image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to vmware_temp/ad66ac73-b499-4794-8e28-f68cd946313f/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2758.738953] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Caching image {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2758.739247] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Copying Virtual Disk [datastore2] vmware_temp/ad66ac73-b499-4794-8e28-f68cd946313f/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk to [datastore2] 
vmware_temp/ad66ac73-b499-4794-8e28-f68cd946313f/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk {{(pid=62277) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2758.739542] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-3645db51-62a6-464b-8d44-c15b01ec04e5 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2758.748125] env[62277]: DEBUG oslo_vmware.api [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Waiting for the task: (returnval){ [ 2758.748125] env[62277]: value = "task-1405539" [ 2758.748125] env[62277]: _type = "Task" [ 2758.748125] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2758.756777] env[62277]: DEBUG oslo_vmware.api [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Task: {'id': task-1405539, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2759.259078] env[62277]: DEBUG oslo_vmware.exceptions [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Fault InvalidArgument not matched. {{(pid=62277) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2759.259412] env[62277]: DEBUG oslo_concurrency.lockutils [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2759.259984] env[62277]: ERROR nova.compute.manager [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2759.259984] env[62277]: Faults: ['InvalidArgument'] [ 2759.259984] env[62277]: ERROR nova.compute.manager [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Traceback (most recent call last): [ 2759.259984] env[62277]: ERROR nova.compute.manager [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2759.259984] env[62277]: ERROR nova.compute.manager [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] yield resources [ 2759.259984] env[62277]: ERROR nova.compute.manager [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2759.259984] env[62277]: ERROR nova.compute.manager [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] self.driver.spawn(context, instance, image_meta, [ 2759.259984] env[62277]: ERROR nova.compute.manager [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2759.259984] 
env[62277]: ERROR nova.compute.manager [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2759.259984] env[62277]: ERROR nova.compute.manager [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2759.259984] env[62277]: ERROR nova.compute.manager [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] self._fetch_image_if_missing(context, vi) [ 2759.259984] env[62277]: ERROR nova.compute.manager [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2759.259984] env[62277]: ERROR nova.compute.manager [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] image_cache(vi, tmp_image_ds_loc) [ 2759.259984] env[62277]: ERROR nova.compute.manager [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2759.259984] env[62277]: ERROR nova.compute.manager [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] vm_util.copy_virtual_disk( [ 2759.259984] env[62277]: ERROR nova.compute.manager [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2759.259984] env[62277]: ERROR nova.compute.manager [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] session._wait_for_task(vmdk_copy_task) [ 2759.259984] env[62277]: ERROR nova.compute.manager [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2759.259984] env[62277]: ERROR nova.compute.manager [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] return self.wait_for_task(task_ref) [ 2759.259984] env[62277]: ERROR nova.compute.manager [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2759.259984] env[62277]: ERROR nova.compute.manager [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] return evt.wait() [ 2759.259984] env[62277]: ERROR nova.compute.manager [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2759.259984] env[62277]: ERROR nova.compute.manager [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] result = hub.switch() [ 2759.259984] env[62277]: ERROR nova.compute.manager [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2759.259984] env[62277]: ERROR nova.compute.manager [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] return self.greenlet.switch() [ 2759.259984] env[62277]: ERROR nova.compute.manager [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2759.259984] env[62277]: ERROR nova.compute.manager [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] self.f(*self.args, **self.kw) [ 2759.259984] env[62277]: ERROR nova.compute.manager [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2759.259984] env[62277]: ERROR nova.compute.manager [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] raise exceptions.translate_fault(task_info.error) [ 2759.259984] env[62277]: ERROR nova.compute.manager [instance: 
26a7549d-94b4-4113-ab8b-10886eafcd49] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2759.259984] env[62277]: ERROR nova.compute.manager [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Faults: ['InvalidArgument'] [ 2759.259984] env[62277]: ERROR nova.compute.manager [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] [ 2759.261210] env[62277]: INFO nova.compute.manager [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Terminating instance [ 2759.261900] env[62277]: DEBUG oslo_concurrency.lockutils [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2759.262148] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2759.262406] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-8af8b8fb-3b1e-484d-9fce-350044440956 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2759.265512] env[62277]: DEBUG nova.compute.manager [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Start destroying the instance on the hypervisor. 
{{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2759.265714] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2759.266525] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c182209-32ac-4a27-84dd-104a77b593dd {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2759.273563] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Unregistering the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2759.273780] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-03747843-71a8-4cc9-ba41-c5943be9ffad {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2759.276176] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2759.276349] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62277) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2759.277383] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d1ada945-310a-4f4b-ba5c-2e2f981cb969 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2759.282007] env[62277]: DEBUG oslo_vmware.api [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Waiting for the task: (returnval){ [ 2759.282007] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52a89396-67f2-26f2-847c-abf308af8f6c" [ 2759.282007] env[62277]: _type = "Task" [ 2759.282007] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2759.289422] env[62277]: DEBUG oslo_vmware.api [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52a89396-67f2-26f2-847c-abf308af8f6c, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2759.351257] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Unregistered the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2759.351474] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Deleting contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2759.351650] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Deleting the datastore file [datastore2] 26a7549d-94b4-4113-ab8b-10886eafcd49 {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2759.351911] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-81dce745-548d-497f-a86b-57a4fc95ccc7 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2759.357828] env[62277]: DEBUG oslo_vmware.api [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Waiting for the task: (returnval){ [ 2759.357828] env[62277]: value = "task-1405541" [ 2759.357828] env[62277]: _type = "Task" [ 2759.357828] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2759.366179] env[62277]: DEBUG oslo_vmware.api [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Task: {'id': task-1405541, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2759.792153] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Preparing fetch location {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2759.792511] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Creating directory with path [datastore2] vmware_temp/f87c612f-ca0e-4b55-a4d7-10f737a900d6/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2759.792643] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e573ba73-49b3-4ca1-bba5-8d202d736eb0 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2759.803927] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Created directory with path [datastore2] vmware_temp/f87c612f-ca0e-4b55-a4d7-10f737a900d6/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2759.804072] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Fetch image to [datastore2] vmware_temp/f87c612f-ca0e-4b55-a4d7-10f737a900d6/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2759.804250] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to [datastore2] vmware_temp/f87c612f-ca0e-4b55-a4d7-10f737a900d6/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2759.804949] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-03144da6-d1fd-436a-8396-9d21bf8caa28 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2759.812046] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c71ad16-7193-41ee-a0a4-10a261fc16a5 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2759.821446] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9283418e-9a2e-4ae7-afdd-3bab102402cc {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2759.850862] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-7e536372-fe8e-4fb7-a369-7d98b084443d {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2759.856519] env[62277]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-c28c75c5-30b3-434c-a215-a33fefab91a6 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2759.865332] env[62277]: DEBUG oslo_vmware.api [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Task: {'id': task-1405541, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.072542} completed successfully. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2759.865579] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Deleted the datastore file {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2759.865790] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Deleted contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2759.865967] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2759.866160] env[62277]: INFO nova.compute.manager [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 2759.868366] env[62277]: DEBUG nova.compute.claims [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Aborting claim: {{(pid=62277) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 2759.868521] env[62277]: DEBUG oslo_concurrency.lockutils [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2759.868728] env[62277]: DEBUG oslo_concurrency.lockutils [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2759.879225] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2759.930876] env[62277]: DEBUG oslo_vmware.rw_handles [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f87c612f-ca0e-4b55-a4d7-10f737a900d6/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62277) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2759.990269] env[62277]: DEBUG oslo_vmware.rw_handles [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Completed reading data from the image iterator. {{(pid=62277) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2759.990500] env[62277]: DEBUG oslo_vmware.rw_handles [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f87c612f-ca0e-4b55-a4d7-10f737a900d6/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=62277) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2760.030920] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6336ebb1-9bc0-4459-9237-204bbd7b13ef {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2760.038503] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f2a9c847-dd9b-4f41-8867-44a6fce2f4b3 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2760.069095] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e32cf0e6-5b6a-4624-a805-83dececafa3c {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2760.076217] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d290b511-a8e6-4367-b9ea-2f189fb67338 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2760.089351] env[62277]: DEBUG nova.compute.provider_tree [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2760.098042] env[62277]: DEBUG nova.scheduler.client.report [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2760.112615] env[62277]: DEBUG oslo_concurrency.lockutils [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.244s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2760.113163] env[62277]: ERROR nova.compute.manager [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2760.113163] env[62277]: Faults: ['InvalidArgument'] [ 2760.113163] env[62277]: ERROR nova.compute.manager [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Traceback (most recent call last): [ 2760.113163] env[62277]: ERROR nova.compute.manager [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in 
_build_and_run_instance [ 2760.113163] env[62277]: ERROR nova.compute.manager [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] self.driver.spawn(context, instance, image_meta, [ 2760.113163] env[62277]: ERROR nova.compute.manager [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2760.113163] env[62277]: ERROR nova.compute.manager [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2760.113163] env[62277]: ERROR nova.compute.manager [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2760.113163] env[62277]: ERROR nova.compute.manager [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] self._fetch_image_if_missing(context, vi) [ 2760.113163] env[62277]: ERROR nova.compute.manager [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2760.113163] env[62277]: ERROR nova.compute.manager [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] image_cache(vi, tmp_image_ds_loc) [ 2760.113163] env[62277]: ERROR nova.compute.manager [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2760.113163] env[62277]: ERROR nova.compute.manager [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] vm_util.copy_virtual_disk( [ 2760.113163] env[62277]: ERROR nova.compute.manager [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2760.113163] env[62277]: ERROR nova.compute.manager [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] session._wait_for_task(vmdk_copy_task) [ 2760.113163] env[62277]: ERROR nova.compute.manager [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2760.113163] env[62277]: ERROR nova.compute.manager [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] return self.wait_for_task(task_ref) [ 2760.113163] env[62277]: ERROR nova.compute.manager [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2760.113163] env[62277]: ERROR nova.compute.manager [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] return evt.wait() [ 2760.113163] env[62277]: ERROR nova.compute.manager [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2760.113163] env[62277]: ERROR nova.compute.manager [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] result = hub.switch() [ 2760.113163] env[62277]: ERROR nova.compute.manager [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2760.113163] env[62277]: ERROR nova.compute.manager [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] return self.greenlet.switch() [ 2760.113163] env[62277]: ERROR nova.compute.manager [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2760.113163] env[62277]: ERROR nova.compute.manager [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] self.f(*self.args, **self.kw) [ 2760.113163] env[62277]: ERROR nova.compute.manager [instance: 
26a7549d-94b4-4113-ab8b-10886eafcd49] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2760.113163] env[62277]: ERROR nova.compute.manager [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] raise exceptions.translate_fault(task_info.error) [ 2760.113163] env[62277]: ERROR nova.compute.manager [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2760.113163] env[62277]: ERROR nova.compute.manager [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Faults: ['InvalidArgument'] [ 2760.113163] env[62277]: ERROR nova.compute.manager [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] [ 2760.113892] env[62277]: DEBUG nova.compute.utils [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] VimFaultException {{(pid=62277) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2760.115519] env[62277]: DEBUG nova.compute.manager [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Build of instance 26a7549d-94b4-4113-ab8b-10886eafcd49 was re-scheduled: A specified parameter was not correct: fileType [ 2760.115519] env[62277]: Faults: ['InvalidArgument'] {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2760.115890] env[62277]: DEBUG nova.compute.manager [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Unplugging VIFs for instance {{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2760.116073] env[62277]: DEBUG nova.compute.manager [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2760.116266] env[62277]: DEBUG nova.compute.manager [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2760.116426] env[62277]: DEBUG nova.network.neutron [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2760.168890] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2760.487427] env[62277]: DEBUG nova.network.neutron [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2760.498048] env[62277]: INFO nova.compute.manager [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Took 0.38 seconds to deallocate network for instance. 
[ 2760.582807] env[62277]: INFO nova.scheduler.client.report [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Deleted allocations for instance 26a7549d-94b4-4113-ab8b-10886eafcd49 [ 2760.603216] env[62277]: DEBUG oslo_concurrency.lockutils [None req-3b81e803-e294-4cf1-a2f0-6230c1682976 tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Lock "26a7549d-94b4-4113-ab8b-10886eafcd49" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 525.806s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2760.603518] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7353d946-5020-413c-a8c2-fdbc24ed782d tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Lock "26a7549d-94b4-4113-ab8b-10886eafcd49" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 329.476s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2760.603745] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7353d946-5020-413c-a8c2-fdbc24ed782d tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Acquiring lock "26a7549d-94b4-4113-ab8b-10886eafcd49-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2760.603975] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7353d946-5020-413c-a8c2-fdbc24ed782d tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Lock "26a7549d-94b4-4113-ab8b-10886eafcd49-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2760.604182] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7353d946-5020-413c-a8c2-fdbc24ed782d tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Lock "26a7549d-94b4-4113-ab8b-10886eafcd49-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2760.606500] env[62277]: INFO nova.compute.manager [None req-7353d946-5020-413c-a8c2-fdbc24ed782d tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Terminating instance [ 2760.610837] env[62277]: DEBUG nova.compute.manager [None req-7353d946-5020-413c-a8c2-fdbc24ed782d tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Start destroying the instance on the hypervisor. 
{{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2760.611054] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-7353d946-5020-413c-a8c2-fdbc24ed782d tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2760.611394] env[62277]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-fb02b0a7-f16e-4f07-827a-45e3c7b97401 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2760.622430] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-249bb5cd-1f5a-4057-a559-d1317adeab02 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2760.646731] env[62277]: WARNING nova.virt.vmwareapi.vmops [None req-7353d946-5020-413c-a8c2-fdbc24ed782d tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 26a7549d-94b4-4113-ab8b-10886eafcd49 could not be found. [ 2760.646933] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-7353d946-5020-413c-a8c2-fdbc24ed782d tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2760.647130] env[62277]: INFO nova.compute.manager [None req-7353d946-5020-413c-a8c2-fdbc24ed782d tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2760.647396] env[62277]: DEBUG oslo.service.loopingcall [None req-7353d946-5020-413c-a8c2-fdbc24ed782d tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2760.647643] env[62277]: DEBUG nova.compute.manager [-] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2760.647742] env[62277]: DEBUG nova.network.neutron [-] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2760.675486] env[62277]: DEBUG nova.network.neutron [-] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2760.684300] env[62277]: INFO nova.compute.manager [-] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] Took 0.04 seconds to deallocate network for instance. 
[ 2760.771265] env[62277]: DEBUG oslo_concurrency.lockutils [None req-7353d946-5020-413c-a8c2-fdbc24ed782d tempest-AttachVolumeShelveTestJSON-1444003900 tempest-AttachVolumeShelveTestJSON-1444003900-project-member] Lock "26a7549d-94b4-4113-ab8b-10886eafcd49" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.168s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2760.772069] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "26a7549d-94b4-4113-ab8b-10886eafcd49" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 278.654s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2760.772267] env[62277]: INFO nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 26a7549d-94b4-4113-ab8b-10886eafcd49] During sync_power_state the instance has a pending task (deleting). Skip. [ 2760.772438] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "26a7549d-94b4-4113-ab8b-10886eafcd49" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2789.889527] env[62277]: DEBUG oslo_concurrency.lockutils [None req-5c40d4f2-470f-4600-9ec3-48008dc8f0f3 tempest-ServersTestJSON-1136389312 tempest-ServersTestJSON-1136389312-project-member] Acquiring lock "6555137f-42d8-4e07-8b1f-b1e431d082ad" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2794.175669] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2802.164620] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2802.168242] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2805.168658] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2805.169229] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager.update_available_resource {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2805.181135] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] 
Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2805.181378] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2805.181498] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2805.181656] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=62277) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2805.183138] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e2d4deea-50cc-4709-9970-a1d9c4356126 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2805.192021] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c6907fe-2add-411a-8ec8-54c1d3c0ab64 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2805.205416] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d4f05ba0-7d6b-4f78-a375-70d9fb8c9348 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2805.212560] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8548eed4-1a1f-4994-bf3e-13ff91ea344d {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2805.242449] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181448MB free_disk=184GB free_vcpus=48 pci_devices=None {{(pid=62277) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2805.242610] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2805.242769] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2805.295756] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 
295cb2fd-b409-4d5c-8fef-12b7acd9fec0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2805.295911] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 5683f242-4848-42fa-9353-46982c3a72c0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2805.296053] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 6555137f-42d8-4e07-8b1f-b1e431d082ad actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2805.296180] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Instance 98c0b8a1-7f1d-4b48-b855-97abc6e015a5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=62277) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2805.296360] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Total usable vcpus: 48, total allocated vcpus: 4 {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2805.296489] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1024MB phys_disk=200GB used_disk=4GB total_vcpus=48 used_vcpus=4 pci_stats=[] {{(pid=62277) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2805.363774] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aca5bf0f-8cd1-4d21-a782-b2aa8bb37e53 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2805.371528] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7a206db4-16be-4b42-ac28-6f0aa0d4318c {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2805.399898] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9c49ff47-f3ee-4d33-9029-1eca74fceb08 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2805.406714] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3bdb076-f809-4bb4-9dfa-35c7c8e55c40 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2805.419460] env[62277]: DEBUG nova.compute.provider_tree [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 
2805.428782] env[62277]: DEBUG nova.scheduler.client.report [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2805.441404] env[62277]: DEBUG nova.compute.resource_tracker [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=62277) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2805.441587] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.199s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2806.441624] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2806.441995] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Starting heal instance info cache {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9909}} [ 2806.441995] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Rebuilding the list of instances to heal {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9913}} [ 2806.455847] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2806.455990] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 5683f242-4848-42fa-9353-46982c3a72c0] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2806.456133] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 6555137f-42d8-4e07-8b1f-b1e431d082ad] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2806.456259] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 98c0b8a1-7f1d-4b48-b855-97abc6e015a5] Skipping network cache update for instance because it is Building. {{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9922}} [ 2806.456385] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Didn't find any instances for network info cache update. 
{{(pid=62277) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9995}} [ 2808.168249] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2808.168624] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2808.168671] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2808.168806] env[62277]: DEBUG nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=62277) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10528}} [ 2808.876431] env[62277]: WARNING oslo_vmware.rw_handles [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2808.876431] env[62277]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2808.876431] env[62277]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2808.876431] env[62277]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2808.876431] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2808.876431] env[62277]: ERROR oslo_vmware.rw_handles response.begin() [ 2808.876431] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2808.876431] env[62277]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2808.876431] env[62277]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2808.876431] env[62277]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2808.876431] env[62277]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2808.876431] env[62277]: ERROR oslo_vmware.rw_handles [ 2808.876922] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Downloaded image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to vmware_temp/f87c612f-ca0e-4b55-a4d7-10f737a900d6/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2808.878802] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Caching image 
{{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2808.879063] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Copying Virtual Disk [datastore2] vmware_temp/f87c612f-ca0e-4b55-a4d7-10f737a900d6/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk to [datastore2] vmware_temp/f87c612f-ca0e-4b55-a4d7-10f737a900d6/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk {{(pid=62277) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2808.879344] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-c70209d1-8475-4e38-974f-b667c875ac6c {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2808.887538] env[62277]: DEBUG oslo_vmware.api [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Waiting for the task: (returnval){ [ 2808.887538] env[62277]: value = "task-1405542" [ 2808.887538] env[62277]: _type = "Task" [ 2808.887538] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2808.895070] env[62277]: DEBUG oslo_vmware.api [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Task: {'id': task-1405542, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2809.397825] env[62277]: DEBUG oslo_vmware.exceptions [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Fault InvalidArgument not matched. 
{{(pid=62277) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2809.398124] env[62277]: DEBUG oslo_concurrency.lockutils [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2809.398702] env[62277]: ERROR nova.compute.manager [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2809.398702] env[62277]: Faults: ['InvalidArgument'] [ 2809.398702] env[62277]: ERROR nova.compute.manager [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Traceback (most recent call last): [ 2809.398702] env[62277]: ERROR nova.compute.manager [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2809.398702] env[62277]: ERROR nova.compute.manager [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] yield resources [ 2809.398702] env[62277]: ERROR nova.compute.manager [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2809.398702] env[62277]: ERROR nova.compute.manager [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] self.driver.spawn(context, instance, image_meta, [ 2809.398702] env[62277]: ERROR nova.compute.manager [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2809.398702] env[62277]: ERROR nova.compute.manager [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2809.398702] env[62277]: ERROR nova.compute.manager [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2809.398702] env[62277]: ERROR nova.compute.manager [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] self._fetch_image_if_missing(context, vi) [ 2809.398702] env[62277]: ERROR nova.compute.manager [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2809.398702] env[62277]: ERROR nova.compute.manager [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] image_cache(vi, tmp_image_ds_loc) [ 2809.398702] env[62277]: ERROR nova.compute.manager [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2809.398702] env[62277]: ERROR nova.compute.manager [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] vm_util.copy_virtual_disk( [ 2809.398702] env[62277]: ERROR nova.compute.manager [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2809.398702] env[62277]: ERROR nova.compute.manager [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] session._wait_for_task(vmdk_copy_task) [ 2809.398702] env[62277]: ERROR nova.compute.manager [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2809.398702] env[62277]: ERROR nova.compute.manager [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] return self.wait_for_task(task_ref) [ 2809.398702] env[62277]: ERROR nova.compute.manager [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2809.398702] env[62277]: ERROR nova.compute.manager [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] return evt.wait() [ 2809.398702] env[62277]: ERROR nova.compute.manager [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2809.398702] env[62277]: ERROR nova.compute.manager [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] result = hub.switch() [ 2809.398702] env[62277]: ERROR nova.compute.manager [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2809.398702] env[62277]: ERROR nova.compute.manager [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] return self.greenlet.switch() [ 2809.398702] env[62277]: ERROR nova.compute.manager [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2809.398702] env[62277]: ERROR nova.compute.manager [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] self.f(*self.args, **self.kw) [ 2809.398702] env[62277]: ERROR nova.compute.manager [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2809.398702] env[62277]: ERROR nova.compute.manager [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] raise exceptions.translate_fault(task_info.error) [ 2809.398702] env[62277]: ERROR nova.compute.manager [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2809.398702] env[62277]: ERROR nova.compute.manager [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Faults: ['InvalidArgument'] [ 2809.398702] env[62277]: ERROR nova.compute.manager [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] [ 2809.399870] env[62277]: INFO nova.compute.manager [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Terminating instance [ 2809.400514] env[62277]: DEBUG oslo_concurrency.lockutils [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2809.400730] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2809.400970] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e59a79d6-c4f6-4452-89f9-fd5586d2ef94 
{{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2809.403113] env[62277]: DEBUG nova.compute.manager [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Start destroying the instance on the hypervisor. {{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2809.403306] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2809.404028] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0f39ea8c-93ed-4272-8835-44d7670e47d7 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2809.410922] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Unregistering the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2809.411159] env[62277]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-2117daa3-1268-4446-b706-b45ad92d1ce3 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2809.413284] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2809.413449] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=62277) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2809.414366] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f99bf7df-f5dc-489a-ba89-1cf40a5f82cb {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2809.418913] env[62277]: DEBUG oslo_vmware.api [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Waiting for the task: (returnval){ [ 2809.418913] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]529b07d3-9ebc-07c9-f5fc-8919f9dd8354" [ 2809.418913] env[62277]: _type = "Task" [ 2809.418913] env[62277]: } to complete. 
{{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2809.425884] env[62277]: DEBUG oslo_vmware.api [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]529b07d3-9ebc-07c9-f5fc-8919f9dd8354, 'name': SearchDatastore_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2809.498656] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Unregistered the VM {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2809.498877] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Deleting contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2809.499067] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Deleting the datastore file [datastore2] 295cb2fd-b409-4d5c-8fef-12b7acd9fec0 {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2809.499330] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-8bdbd985-2793-4933-ae3d-8d462f21029d {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2809.505316] env[62277]: DEBUG oslo_vmware.api [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Waiting for the task: (returnval){ [ 2809.505316] env[62277]: value = "task-1405544" [ 2809.505316] env[62277]: _type = "Task" [ 2809.505316] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2809.513022] env[62277]: DEBUG oslo_vmware.api [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Task: {'id': task-1405544, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2809.929020] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 5683f242-4848-42fa-9353-46982c3a72c0] Preparing fetch location {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2809.929275] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Creating directory with path [datastore2] vmware_temp/270e7201-c1c0-4d31-bc9b-fe2c682836fa/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2809.929530] env[62277]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d17d2416-7c7c-4de3-adbd-e3de54b6c853 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2809.940414] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Created directory with path [datastore2] vmware_temp/270e7201-c1c0-4d31-bc9b-fe2c682836fa/6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2809.940598] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 5683f242-4848-42fa-9353-46982c3a72c0] Fetch image to [datastore2] vmware_temp/270e7201-c1c0-4d31-bc9b-fe2c682836fa/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2809.940779] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 5683f242-4848-42fa-9353-46982c3a72c0] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to [datastore2] vmware_temp/270e7201-c1c0-4d31-bc9b-fe2c682836fa/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk on the data store datastore2 {{(pid=62277) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2809.941583] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4c98ba4-6f2a-4908-af4f-c3a232dd3573 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2809.947865] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-13f587c6-57d9-44d5-8e4f-19c871d2de5a {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2809.956836] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8532ccbc-ebc9-45a4-b215-af301762f752 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2809.987537] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82fb05e8-314a-4f09-a173-e8defcc988ed {{(pid=62277) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2809.993105] env[62277]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-e6a4c828-e4a9-4016-8689-90c4877a3163 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2810.011680] env[62277]: DEBUG nova.virt.vmwareapi.images [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: 5683f242-4848-42fa-9353-46982c3a72c0] Downloading image file data 6f125163-af69-40e9-92ae-3b8a01d74b60 to the data store datastore2 {{(pid=62277) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2810.016134] env[62277]: DEBUG oslo_vmware.api [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Task: {'id': task-1405544, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.063673} completed successfully. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2810.016355] env[62277]: DEBUG nova.virt.vmwareapi.ds_util [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Deleted the datastore file {{(pid=62277) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2810.016531] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Deleted contents of the VM from datastore datastore2 {{(pid=62277) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2810.016694] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2810.016857] env[62277]: INFO nova.compute.manager [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Took 0.61 seconds to destroy the instance on the hypervisor. 
[ 2810.019024] env[62277]: DEBUG nova.compute.claims [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Aborting claim: {{(pid=62277) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 2810.019178] env[62277]: DEBUG oslo_concurrency.lockutils [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2810.019333] env[62277]: DEBUG oslo_concurrency.lockutils [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2810.063890] env[62277]: DEBUG oslo_vmware.rw_handles [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/270e7201-c1c0-4d31-bc9b-fe2c682836fa/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=62277) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2810.123036] env[62277]: DEBUG oslo_vmware.rw_handles [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Completed reading data from the image iterator. {{(pid=62277) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2810.123238] env[62277]: DEBUG oslo_vmware.rw_handles [None req-b4d6e1e3-b100-4ed5-98df-e072b62755ae tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/270e7201-c1c0-4d31-bc9b-fe2c682836fa/6f125163-af69-40e9-92ae-3b8a01d74b60/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=62277) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2810.161456] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-162cf3ea-aeca-429e-8f0a-2aedb7a7bd70 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2810.169559] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f3eef83-50ab-448b-8e43-b347a814be9b {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2810.198077] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-83ae98ef-12e5-4353-a4e7-8fab79721e80 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2810.204607] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4048443a-45c1-405b-92ac-4ac4aa7c117d {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2810.216790] env[62277]: DEBUG nova.compute.provider_tree [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2810.227086] env[62277]: DEBUG nova.scheduler.client.report [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2810.240291] env[62277]: DEBUG oslo_concurrency.lockutils [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.221s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2810.240832] env[62277]: ERROR nova.compute.manager [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2810.240832] env[62277]: Faults: ['InvalidArgument'] [ 2810.240832] env[62277]: ERROR nova.compute.manager [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Traceback (most recent call last): [ 2810.240832] env[62277]: ERROR nova.compute.manager [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2810.240832] 
env[62277]: ERROR nova.compute.manager [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] self.driver.spawn(context, instance, image_meta, [ 2810.240832] env[62277]: ERROR nova.compute.manager [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2810.240832] env[62277]: ERROR nova.compute.manager [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2810.240832] env[62277]: ERROR nova.compute.manager [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2810.240832] env[62277]: ERROR nova.compute.manager [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] self._fetch_image_if_missing(context, vi) [ 2810.240832] env[62277]: ERROR nova.compute.manager [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2810.240832] env[62277]: ERROR nova.compute.manager [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] image_cache(vi, tmp_image_ds_loc) [ 2810.240832] env[62277]: ERROR nova.compute.manager [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2810.240832] env[62277]: ERROR nova.compute.manager [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] vm_util.copy_virtual_disk( [ 2810.240832] env[62277]: ERROR nova.compute.manager [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2810.240832] env[62277]: ERROR nova.compute.manager [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] session._wait_for_task(vmdk_copy_task) [ 2810.240832] env[62277]: ERROR nova.compute.manager [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2810.240832] env[62277]: ERROR nova.compute.manager [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] return self.wait_for_task(task_ref) [ 2810.240832] env[62277]: ERROR nova.compute.manager [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2810.240832] env[62277]: ERROR nova.compute.manager [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] return evt.wait() [ 2810.240832] env[62277]: ERROR nova.compute.manager [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2810.240832] env[62277]: ERROR nova.compute.manager [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] result = hub.switch() [ 2810.240832] env[62277]: ERROR nova.compute.manager [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2810.240832] env[62277]: ERROR nova.compute.manager [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] return self.greenlet.switch() [ 2810.240832] env[62277]: ERROR nova.compute.manager [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2810.240832] env[62277]: ERROR nova.compute.manager [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] self.f(*self.args, **self.kw) [ 2810.240832] env[62277]: ERROR nova.compute.manager [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2810.240832] env[62277]: ERROR nova.compute.manager [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] raise exceptions.translate_fault(task_info.error) [ 2810.240832] env[62277]: ERROR nova.compute.manager [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2810.240832] env[62277]: ERROR nova.compute.manager [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Faults: ['InvalidArgument'] [ 2810.240832] env[62277]: ERROR nova.compute.manager [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] [ 2810.241867] env[62277]: DEBUG nova.compute.utils [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] VimFaultException {{(pid=62277) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2810.243026] env[62277]: DEBUG nova.compute.manager [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Build of instance 295cb2fd-b409-4d5c-8fef-12b7acd9fec0 was re-scheduled: A specified parameter was not correct: fileType [ 2810.243026] env[62277]: Faults: ['InvalidArgument'] {{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2810.243388] env[62277]: DEBUG nova.compute.manager [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Unplugging VIFs for instance {{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2810.243557] env[62277]: DEBUG nova.compute.manager [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=62277) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2810.243725] env[62277]: DEBUG nova.compute.manager [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2810.243883] env[62277]: DEBUG nova.network.neutron [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2810.589544] env[62277]: DEBUG nova.network.neutron [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2810.600405] env[62277]: INFO nova.compute.manager [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Took 0.36 seconds to deallocate network for instance. [ 2810.687749] env[62277]: INFO nova.scheduler.client.report [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Deleted allocations for instance 295cb2fd-b409-4d5c-8fef-12b7acd9fec0 [ 2810.707291] env[62277]: DEBUG oslo_concurrency.lockutils [None req-8ecc6e41-ac83-408f-b637-c3d5382e2c50 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Lock "295cb2fd-b409-4d5c-8fef-12b7acd9fec0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 524.229s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2810.707541] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "295cb2fd-b409-4d5c-8fef-12b7acd9fec0" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 328.589s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2810.707724] env[62277]: INFO nova.compute.manager [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] During sync_power_state the instance has a pending task (spawning). Skip. 
[ 2810.707895] env[62277]: DEBUG oslo_concurrency.lockutils [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Lock "295cb2fd-b409-4d5c-8fef-12b7acd9fec0" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2810.708131] env[62277]: DEBUG oslo_concurrency.lockutils [None req-339afca7-4cd5-46f6-943c-50429aa5af37 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Lock "295cb2fd-b409-4d5c-8fef-12b7acd9fec0" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 328.271s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2810.708339] env[62277]: DEBUG oslo_concurrency.lockutils [None req-339afca7-4cd5-46f6-943c-50429aa5af37 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Acquiring lock "295cb2fd-b409-4d5c-8fef-12b7acd9fec0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2810.708681] env[62277]: DEBUG oslo_concurrency.lockutils [None req-339afca7-4cd5-46f6-943c-50429aa5af37 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Lock "295cb2fd-b409-4d5c-8fef-12b7acd9fec0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2810.708924] env[62277]: DEBUG oslo_concurrency.lockutils [None req-339afca7-4cd5-46f6-943c-50429aa5af37 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Lock "295cb2fd-b409-4d5c-8fef-12b7acd9fec0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2810.711351] env[62277]: INFO nova.compute.manager [None req-339afca7-4cd5-46f6-943c-50429aa5af37 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Terminating instance [ 2810.713638] env[62277]: DEBUG nova.compute.manager [None req-339afca7-4cd5-46f6-943c-50429aa5af37 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Start destroying the instance on the hypervisor. 
{{(pid=62277) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2810.713972] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-339afca7-4cd5-46f6-943c-50429aa5af37 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Destroying instance {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2810.714515] env[62277]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-d0c159d1-2cd2-441c-8420-d587d79da7b6 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2810.723827] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e3e8e35b-4239-4099-b626-7c72f402cd67 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2810.748585] env[62277]: WARNING nova.virt.vmwareapi.vmops [None req-339afca7-4cd5-46f6-943c-50429aa5af37 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 295cb2fd-b409-4d5c-8fef-12b7acd9fec0 could not be found. [ 2810.748814] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-339afca7-4cd5-46f6-943c-50429aa5af37 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Instance destroyed {{(pid=62277) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2810.749015] env[62277]: INFO nova.compute.manager [None req-339afca7-4cd5-46f6-943c-50429aa5af37 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2810.749289] env[62277]: DEBUG oslo.service.loopingcall [None req-339afca7-4cd5-46f6-943c-50429aa5af37 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2810.749539] env[62277]: DEBUG nova.compute.manager [-] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Deallocating network for instance {{(pid=62277) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2810.749649] env[62277]: DEBUG nova.network.neutron [-] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] deallocate_for_instance() {{(pid=62277) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2810.774972] env[62277]: DEBUG nova.network.neutron [-] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Updating instance_info_cache with network_info: [] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2810.782769] env[62277]: INFO nova.compute.manager [-] [instance: 295cb2fd-b409-4d5c-8fef-12b7acd9fec0] Took 0.03 seconds to deallocate network for instance. 
[ 2810.862860] env[62277]: DEBUG oslo_concurrency.lockutils [None req-339afca7-4cd5-46f6-943c-50429aa5af37 tempest-ServerDiskConfigTestJSON-774955912 tempest-ServerDiskConfigTestJSON-774955912-project-member] Lock "295cb2fd-b409-4d5c-8fef-12b7acd9fec0" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.154s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2817.165326] env[62277]: DEBUG oslo_service.periodic_task [None req-e4467ea7-d217-48fb-a66c-a8dec538ebdc None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=62277) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2829.968380] env[62277]: DEBUG oslo_concurrency.lockutils [None req-cb03c73d-edc7-4374-9831-d2136f0d0e5d tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Acquiring lock "f04c6a49-b3aa-4618-a962-6bc7b7535ece" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2829.968783] env[62277]: DEBUG oslo_concurrency.lockutils [None req-cb03c73d-edc7-4374-9831-d2136f0d0e5d tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Lock "f04c6a49-b3aa-4618-a962-6bc7b7535ece" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2829.979474] env[62277]: DEBUG nova.compute.manager [None req-cb03c73d-edc7-4374-9831-d2136f0d0e5d tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: f04c6a49-b3aa-4618-a962-6bc7b7535ece] Starting instance... 
{{(pid=62277) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 2830.029255] env[62277]: DEBUG oslo_concurrency.lockutils [None req-cb03c73d-edc7-4374-9831-d2136f0d0e5d tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2830.029499] env[62277]: DEBUG oslo_concurrency.lockutils [None req-cb03c73d-edc7-4374-9831-d2136f0d0e5d tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2830.031170] env[62277]: INFO nova.compute.claims [None req-cb03c73d-edc7-4374-9831-d2136f0d0e5d tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: f04c6a49-b3aa-4618-a962-6bc7b7535ece] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2830.135016] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ee3bdd01-c420-4831-a933-1ea2d8529919 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2830.141535] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-653a0a98-3752-4a7a-9cc4-5bcd2b7a6c4f {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2830.170583] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c6dab88-df40-4686-b338-cf2f9c357394 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2830.177295] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2a328940-184a-4433-bc7c-8fe5b9e8b15d {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2830.189788] env[62277]: DEBUG nova.compute.provider_tree [None req-cb03c73d-edc7-4374-9831-d2136f0d0e5d tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Inventory has not changed in ProviderTree for provider: 75e125ea-a599-4b65-b9cd-6ea881735292 {{(pid=62277) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2830.198122] env[62277]: DEBUG nova.scheduler.client.report [None req-cb03c73d-edc7-4374-9831-d2136f0d0e5d tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Inventory has not changed for provider 75e125ea-a599-4b65-b9cd-6ea881735292 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 184, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=62277) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2830.210580] env[62277]: DEBUG oslo_concurrency.lockutils [None 
req-cb03c73d-edc7-4374-9831-d2136f0d0e5d tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.181s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2830.211212] env[62277]: DEBUG nova.compute.manager [None req-cb03c73d-edc7-4374-9831-d2136f0d0e5d tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: f04c6a49-b3aa-4618-a962-6bc7b7535ece] Start building networks asynchronously for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 2830.242533] env[62277]: DEBUG nova.compute.utils [None req-cb03c73d-edc7-4374-9831-d2136f0d0e5d tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Using /dev/sd instead of None {{(pid=62277) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2830.243972] env[62277]: DEBUG nova.compute.manager [None req-cb03c73d-edc7-4374-9831-d2136f0d0e5d tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: f04c6a49-b3aa-4618-a962-6bc7b7535ece] Allocating IP information in the background. {{(pid=62277) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 2830.244313] env[62277]: DEBUG nova.network.neutron [None req-cb03c73d-edc7-4374-9831-d2136f0d0e5d tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: f04c6a49-b3aa-4618-a962-6bc7b7535ece] allocate_for_instance() {{(pid=62277) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2830.254505] env[62277]: DEBUG nova.compute.manager [None req-cb03c73d-edc7-4374-9831-d2136f0d0e5d tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: f04c6a49-b3aa-4618-a962-6bc7b7535ece] Start building block device mappings for instance. {{(pid=62277) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 2830.307154] env[62277]: DEBUG nova.policy [None req-cb03c73d-edc7-4374-9831-d2136f0d0e5d tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '00ed93b61873452bbc15280d2de65bd8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0c951cee39d94e49af963590cccf95fb', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=62277) authorize /opt/stack/nova/nova/policy.py:203}} [ 2830.318364] env[62277]: DEBUG nova.compute.manager [None req-cb03c73d-edc7-4374-9831-d2136f0d0e5d tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: f04c6a49-b3aa-4618-a962-6bc7b7535ece] Start spawning the instance on the hypervisor. 
{{(pid=62277) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 2830.346101] env[62277]: DEBUG nova.virt.hardware [None req-cb03c73d-edc7-4374-9831-d2136f0d0e5d tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-11-06T22:11:30Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-11-06T22:11:15Z,direct_url=,disk_format='vmdk',id=6f125163-af69-40e9-92ae-3b8a01d74b60,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='63ed6630e9c140baa826f53d7a0564d1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-11-06T22:11:16Z,virtual_size=,visibility=), allow threads: False {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2830.346101] env[62277]: DEBUG nova.virt.hardware [None req-cb03c73d-edc7-4374-9831-d2136f0d0e5d tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Flavor limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2830.346101] env[62277]: DEBUG nova.virt.hardware [None req-cb03c73d-edc7-4374-9831-d2136f0d0e5d tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Image limits 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2830.346101] env[62277]: DEBUG nova.virt.hardware [None req-cb03c73d-edc7-4374-9831-d2136f0d0e5d tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Flavor pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2830.346101] env[62277]: DEBUG nova.virt.hardware [None req-cb03c73d-edc7-4374-9831-d2136f0d0e5d tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Image pref 0:0:0 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2830.346101] env[62277]: DEBUG nova.virt.hardware [None req-cb03c73d-edc7-4374-9831-d2136f0d0e5d tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=62277) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2830.346101] env[62277]: DEBUG nova.virt.hardware [None req-cb03c73d-edc7-4374-9831-d2136f0d0e5d tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2830.346101] env[62277]: DEBUG nova.virt.hardware [None req-cb03c73d-edc7-4374-9831-d2136f0d0e5d tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2830.346101] env[62277]: DEBUG nova.virt.hardware [None 
req-cb03c73d-edc7-4374-9831-d2136f0d0e5d tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Got 1 possible topologies {{(pid=62277) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2830.347084] env[62277]: DEBUG nova.virt.hardware [None req-cb03c73d-edc7-4374-9831-d2136f0d0e5d tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2830.347473] env[62277]: DEBUG nova.virt.hardware [None req-cb03c73d-edc7-4374-9831-d2136f0d0e5d tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=62277) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2830.348764] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-25fcd34d-e49d-4ff5-933e-1a34a9f96a5b {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2830.357926] env[62277]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e3f42bc3-29e3-4f81-8257-1dc03592cd17 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2830.622011] env[62277]: DEBUG nova.network.neutron [None req-cb03c73d-edc7-4374-9831-d2136f0d0e5d tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: f04c6a49-b3aa-4618-a962-6bc7b7535ece] Successfully created port: 8947f5b0-29ed-435b-a7ea-a3264a65bdb4 {{(pid=62277) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2831.405016] env[62277]: DEBUG nova.compute.manager [req-067e59eb-0bba-473a-b201-b7b54c6cfa0b req-e78f764c-8154-4edf-9bd0-58cbdf8e8968 service nova] [instance: f04c6a49-b3aa-4618-a962-6bc7b7535ece] Received event network-vif-plugged-8947f5b0-29ed-435b-a7ea-a3264a65bdb4 {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 2831.405256] env[62277]: DEBUG oslo_concurrency.lockutils [req-067e59eb-0bba-473a-b201-b7b54c6cfa0b req-e78f764c-8154-4edf-9bd0-58cbdf8e8968 service nova] Acquiring lock "f04c6a49-b3aa-4618-a962-6bc7b7535ece-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2831.405473] env[62277]: DEBUG oslo_concurrency.lockutils [req-067e59eb-0bba-473a-b201-b7b54c6cfa0b req-e78f764c-8154-4edf-9bd0-58cbdf8e8968 service nova] Lock "f04c6a49-b3aa-4618-a962-6bc7b7535ece-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2831.405646] env[62277]: DEBUG oslo_concurrency.lockutils [req-067e59eb-0bba-473a-b201-b7b54c6cfa0b req-e78f764c-8154-4edf-9bd0-58cbdf8e8968 service nova] Lock "f04c6a49-b3aa-4618-a962-6bc7b7535ece-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=62277) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2831.405808] env[62277]: DEBUG nova.compute.manager 
[req-067e59eb-0bba-473a-b201-b7b54c6cfa0b req-e78f764c-8154-4edf-9bd0-58cbdf8e8968 service nova] [instance: f04c6a49-b3aa-4618-a962-6bc7b7535ece] No waiting events found dispatching network-vif-plugged-8947f5b0-29ed-435b-a7ea-a3264a65bdb4 {{(pid=62277) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2831.405969] env[62277]: WARNING nova.compute.manager [req-067e59eb-0bba-473a-b201-b7b54c6cfa0b req-e78f764c-8154-4edf-9bd0-58cbdf8e8968 service nova] [instance: f04c6a49-b3aa-4618-a962-6bc7b7535ece] Received unexpected event network-vif-plugged-8947f5b0-29ed-435b-a7ea-a3264a65bdb4 for instance with vm_state building and task_state spawning. [ 2831.482095] env[62277]: DEBUG nova.network.neutron [None req-cb03c73d-edc7-4374-9831-d2136f0d0e5d tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: f04c6a49-b3aa-4618-a962-6bc7b7535ece] Successfully updated port: 8947f5b0-29ed-435b-a7ea-a3264a65bdb4 {{(pid=62277) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2831.498617] env[62277]: DEBUG oslo_concurrency.lockutils [None req-cb03c73d-edc7-4374-9831-d2136f0d0e5d tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Acquiring lock "refresh_cache-f04c6a49-b3aa-4618-a962-6bc7b7535ece" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2831.498824] env[62277]: DEBUG oslo_concurrency.lockutils [None req-cb03c73d-edc7-4374-9831-d2136f0d0e5d tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Acquired lock "refresh_cache-f04c6a49-b3aa-4618-a962-6bc7b7535ece" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2831.499063] env[62277]: DEBUG nova.network.neutron [None req-cb03c73d-edc7-4374-9831-d2136f0d0e5d tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: f04c6a49-b3aa-4618-a962-6bc7b7535ece] Building network info cache for instance {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2831.552519] env[62277]: DEBUG nova.network.neutron [None req-cb03c73d-edc7-4374-9831-d2136f0d0e5d tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: f04c6a49-b3aa-4618-a962-6bc7b7535ece] Instance cache missing network info. 
{{(pid=62277) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2831.703422] env[62277]: DEBUG nova.network.neutron [None req-cb03c73d-edc7-4374-9831-d2136f0d0e5d tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: f04c6a49-b3aa-4618-a962-6bc7b7535ece] Updating instance_info_cache with network_info: [{"id": "8947f5b0-29ed-435b-a7ea-a3264a65bdb4", "address": "fa:16:3e:16:bf:c8", "network": {"id": "73ff19f8-2a7e-46b5-ad84-44eb5672dd85", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1025699415-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0c951cee39d94e49af963590cccf95fb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "09bf081b-cdf0-4977-abe2-2339a87409ab", "external-id": "nsx-vlan-transportzone-378", "segmentation_id": 378, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8947f5b0-29", "ovs_interfaceid": "8947f5b0-29ed-435b-a7ea-a3264a65bdb4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2831.714085] env[62277]: DEBUG oslo_concurrency.lockutils [None req-cb03c73d-edc7-4374-9831-d2136f0d0e5d tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Releasing lock "refresh_cache-f04c6a49-b3aa-4618-a962-6bc7b7535ece" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2831.714363] env[62277]: DEBUG nova.compute.manager [None req-cb03c73d-edc7-4374-9831-d2136f0d0e5d tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: f04c6a49-b3aa-4618-a962-6bc7b7535ece] Instance network_info: |[{"id": "8947f5b0-29ed-435b-a7ea-a3264a65bdb4", "address": "fa:16:3e:16:bf:c8", "network": {"id": "73ff19f8-2a7e-46b5-ad84-44eb5672dd85", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1025699415-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0c951cee39d94e49af963590cccf95fb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "09bf081b-cdf0-4977-abe2-2339a87409ab", "external-id": "nsx-vlan-transportzone-378", "segmentation_id": 378, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8947f5b0-29", "ovs_interfaceid": "8947f5b0-29ed-435b-a7ea-a3264a65bdb4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=62277) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 2831.714762] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-cb03c73d-edc7-4374-9831-d2136f0d0e5d tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: f04c6a49-b3aa-4618-a962-6bc7b7535ece] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:16:bf:c8', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '09bf081b-cdf0-4977-abe2-2339a87409ab', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '8947f5b0-29ed-435b-a7ea-a3264a65bdb4', 'vif_model': 'vmxnet3'}] {{(pid=62277) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2831.722414] env[62277]: DEBUG oslo.service.loopingcall [None req-cb03c73d-edc7-4374-9831-d2136f0d0e5d tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=62277) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2831.723364] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f04c6a49-b3aa-4618-a962-6bc7b7535ece] Creating VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2831.723730] env[62277]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-5b4f9cf9-d675-4a05-8ca7-ff8a6dddfdef {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2831.743776] env[62277]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2831.743776] env[62277]: value = "task-1405545" [ 2831.743776] env[62277]: _type = "Task" [ 2831.743776] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2831.751522] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405545, 'name': CreateVM_Task} progress is 0%. {{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2832.254299] env[62277]: DEBUG oslo_vmware.api [-] Task: {'id': task-1405545, 'name': CreateVM_Task, 'duration_secs': 0.286101} completed successfully. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2832.254472] env[62277]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f04c6a49-b3aa-4618-a962-6bc7b7535ece] Created VM on the ESX host {{(pid=62277) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2832.255183] env[62277]: DEBUG oslo_concurrency.lockutils [None req-cb03c73d-edc7-4374-9831-d2136f0d0e5d tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2832.255350] env[62277]: DEBUG oslo_concurrency.lockutils [None req-cb03c73d-edc7-4374-9831-d2136f0d0e5d tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Acquired lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2832.255683] env[62277]: DEBUG oslo_concurrency.lockutils [None req-cb03c73d-edc7-4374-9831-d2136f0d0e5d tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2832.255939] env[62277]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e020d3fe-3c33-4c15-be75-3a33d123a740 {{(pid=62277) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2832.262775] env[62277]: DEBUG oslo_vmware.api [None req-cb03c73d-edc7-4374-9831-d2136f0d0e5d tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Waiting for the task: (returnval){ [ 2832.262775] env[62277]: value = "session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52c04737-2415-9e2d-5ed9-73ee39561f15" [ 2832.262775] env[62277]: _type = "Task" [ 2832.262775] env[62277]: } to complete. {{(pid=62277) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2832.273497] env[62277]: DEBUG oslo_vmware.api [None req-cb03c73d-edc7-4374-9831-d2136f0d0e5d tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Task: {'id': session[522b6a8e-22a7-4f2b-cf83-8c2495219406]52c04737-2415-9e2d-5ed9-73ee39561f15, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=62277) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2832.773957] env[62277]: DEBUG oslo_concurrency.lockutils [None req-cb03c73d-edc7-4374-9831-d2136f0d0e5d tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Releasing lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2832.774326] env[62277]: DEBUG nova.virt.vmwareapi.vmops [None req-cb03c73d-edc7-4374-9831-d2136f0d0e5d tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] [instance: f04c6a49-b3aa-4618-a962-6bc7b7535ece] Processing image 6f125163-af69-40e9-92ae-3b8a01d74b60 {{(pid=62277) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2832.774493] env[62277]: DEBUG oslo_concurrency.lockutils [None req-cb03c73d-edc7-4374-9831-d2136f0d0e5d tempest-DeleteServersTestJSON-689790343 tempest-DeleteServersTestJSON-689790343-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/6f125163-af69-40e9-92ae-3b8a01d74b60/6f125163-af69-40e9-92ae-3b8a01d74b60.vmdk" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2833.434773] env[62277]: DEBUG nova.compute.manager [req-ac017351-d449-4cf3-9851-3b2fc8b5e7b5 req-6a78794f-546a-41ca-bb90-1202ed9ee0a5 service nova] [instance: f04c6a49-b3aa-4618-a962-6bc7b7535ece] Received event network-changed-8947f5b0-29ed-435b-a7ea-a3264a65bdb4 {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11099}} [ 2833.434979] env[62277]: DEBUG nova.compute.manager [req-ac017351-d449-4cf3-9851-3b2fc8b5e7b5 req-6a78794f-546a-41ca-bb90-1202ed9ee0a5 service nova] [instance: f04c6a49-b3aa-4618-a962-6bc7b7535ece] Refreshing instance network info cache due to event network-changed-8947f5b0-29ed-435b-a7ea-a3264a65bdb4. {{(pid=62277) external_instance_event /opt/stack/nova/nova/compute/manager.py:11104}} [ 2833.435311] env[62277]: DEBUG oslo_concurrency.lockutils [req-ac017351-d449-4cf3-9851-3b2fc8b5e7b5 req-6a78794f-546a-41ca-bb90-1202ed9ee0a5 service nova] Acquiring lock "refresh_cache-f04c6a49-b3aa-4618-a962-6bc7b7535ece" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2833.435481] env[62277]: DEBUG oslo_concurrency.lockutils [req-ac017351-d449-4cf3-9851-3b2fc8b5e7b5 req-6a78794f-546a-41ca-bb90-1202ed9ee0a5 service nova] Acquired lock "refresh_cache-f04c6a49-b3aa-4618-a962-6bc7b7535ece" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2833.435644] env[62277]: DEBUG nova.network.neutron [req-ac017351-d449-4cf3-9851-3b2fc8b5e7b5 req-6a78794f-546a-41ca-bb90-1202ed9ee0a5 service nova] [instance: f04c6a49-b3aa-4618-a962-6bc7b7535ece] Refreshing network info cache for port 8947f5b0-29ed-435b-a7ea-a3264a65bdb4 {{(pid=62277) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 2833.849562] env[62277]: DEBUG nova.network.neutron [req-ac017351-d449-4cf3-9851-3b2fc8b5e7b5 req-6a78794f-546a-41ca-bb90-1202ed9ee0a5 service nova] [instance: f04c6a49-b3aa-4618-a962-6bc7b7535ece] Updated VIF entry in instance network info cache for port 8947f5b0-29ed-435b-a7ea-a3264a65bdb4. 
{{(pid=62277) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 2833.849901] env[62277]: DEBUG nova.network.neutron [req-ac017351-d449-4cf3-9851-3b2fc8b5e7b5 req-6a78794f-546a-41ca-bb90-1202ed9ee0a5 service nova] [instance: f04c6a49-b3aa-4618-a962-6bc7b7535ece] Updating instance_info_cache with network_info: [{"id": "8947f5b0-29ed-435b-a7ea-a3264a65bdb4", "address": "fa:16:3e:16:bf:c8", "network": {"id": "73ff19f8-2a7e-46b5-ad84-44eb5672dd85", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1025699415-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0c951cee39d94e49af963590cccf95fb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "09bf081b-cdf0-4977-abe2-2339a87409ab", "external-id": "nsx-vlan-transportzone-378", "segmentation_id": 378, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8947f5b0-29", "ovs_interfaceid": "8947f5b0-29ed-435b-a7ea-a3264a65bdb4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=62277) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2833.858999] env[62277]: DEBUG oslo_concurrency.lockutils [req-ac017351-d449-4cf3-9851-3b2fc8b5e7b5 req-6a78794f-546a-41ca-bb90-1202ed9ee0a5 service nova] Releasing lock "refresh_cache-f04c6a49-b3aa-4618-a962-6bc7b7535ece" {{(pid=62277) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
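
The "_get_possible_cpu_topologies" / "Got 1 possible topologies" entries near timestamp 2830.346101 trace Nova enumerating socket/core/thread layouts for the m1.nano flavor's single vCPU. The snippet below is a minimal Python sketch of that enumeration pattern only, not the code in nova/virt/hardware.py; the 65536 defaults simply mirror the limits printed in the log.

    import itertools
    from collections import namedtuple

    VirtCPUTopology = namedtuple("VirtCPUTopology", ["sockets", "cores", "threads"])

    def possible_topologies(vcpus, max_sockets=65536, max_cores=65536, max_threads=65536):
        # Enumerate every (sockets, cores, threads) combination whose product
        # equals the vCPU count and respects the per-dimension limits, mirroring
        # the "Build topologies for N vcpu(s)" step seen in the log above.
        found = []
        for sockets in range(1, min(vcpus, max_sockets) + 1):
            for cores in range(1, min(vcpus, max_cores) + 1):
                for threads in range(1, min(vcpus, max_threads) + 1):
                    if sockets * cores * threads == vcpus:
                        found.append(VirtCPUTopology(sockets, cores, threads))
        return found

    # For 1 vCPU this yields the single topology the log reports:
    # [VirtCPUTopology(sockets=1, cores=1, threads=1)]
    print(possible_topologies(1))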
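
The Acquiring/Acquired/Releasing lines for locks such as "refresh_cache-f04c6a49-..." and "f04c6a49-...-events" come from oslo_concurrency.lockutils, which serializes concurrent work on one instance's cached state. A minimal sketch of the same pattern is below; refresh_instance_cache and fetch_nw_info are hypothetical names used only for illustration, not Nova functions.

    from oslo_concurrency import lockutils

    def refresh_instance_cache(instance_uuid, fetch_nw_info):
        # Same shape as the "Acquiring lock refresh_cache-<uuid>" /
        # "Releasing lock refresh_cache-<uuid>" pairs in the log: only one
        # thread at a time rebuilds a given instance's network-info cache.
        with lockutils.lock("refresh_cache-%s" % instance_uuid):
            return fetch_nw_info(instance_uuid)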
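
The two "Updating instance_info_cache with network_info: [...]" blobs above carry the full VIF model as JSON (json.loads accepts the bracketed list as logged). When reading such entries, a small helper like the one below, an illustrative sketch rather than anything in Nova, pulls out the fields of interest; applied to the blob above it yields port 8947f5b0-29ed-435b-a7ea-a3264a65bdb4, MAC fa:16:3e:16:bf:c8, fixed IP 192.168.128.7, vif_type "ovs".

    def summarize_network_info(network_info):
        # Reduce each VIF entry in the logged network_info list to the
        # fields most often needed when reading the cache-update lines.
        rows = []
        for vif in network_info:
            ips = [ip["address"]
                   for subnet in vif["network"]["subnets"]
                   for ip in subnet["ips"]]
            rows.append({"port_id": vif["id"],
                         "mac": vif["address"],
                         "ips": ips,
                         "vif_type": vif["type"]})
        return rows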
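
The CreateVM_Task and SearchDatastore_Task entries show the oslo.vmware pattern of submitting a vCenter task and polling it until it finishes ("progress is 0%" ... "completed successfully. duration_secs: 0.286101"). The loop below is a simplified stand-in for that polling, assuming a caller-supplied poll_fn that returns (state, progress, result); it is not the oslo_vmware.api implementation, where the equivalent of poll_fn is a property read against the task managed object.

    import time

    def wait_for_task(poll_fn, interval=0.5, timeout=300):
        # Keep asking the server for task state until it reports success or
        # error, the same loop the "_poll_task ... progress is N%" lines trace.
        deadline = time.monotonic() + timeout
        while time.monotonic() < deadline:
            state, progress, result = poll_fn()
            if state == "success":
                return result
            if state == "error":
                raise RuntimeError("task failed: %r" % (result,))
            time.sleep(interval)
        raise TimeoutError("task did not complete within %ss" % timeout)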