[ 552.219449] env[59473]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 552.770152] env[59518]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 554.314543] env[59518]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' {{(pid=59518) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 554.314907] env[59518]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' {{(pid=59518) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 554.314946] env[59518]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' {{(pid=59518) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 554.315228] env[59518]: INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
[ 554.316317] env[59518]: WARNING nova.servicegroup.api [-] Report interval must be less than service down time. Current config: . Setting service_down_time to: 300
[ 554.428766] env[59518]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm {{(pid=59518) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}}
[ 554.438239] env[59518]: DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.009s {{(pid=59518) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}}
[ 554.533368] env[59518]: INFO nova.virt.driver [None req-01161e66-312a-4f5f-b5c4-11197e24aaf3 None None] Loading compute driver 'vmwareapi.VMwareVCDriver'
[ 554.610125] env[59518]: DEBUG oslo_concurrency.lockutils [-] Acquiring lock "oslo_vmware_api_lock" by "oslo_vmware.api.VMwareAPISession._create_session" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 554.610467] env[59518]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" acquired by "oslo_vmware.api.VMwareAPISession._create_session" :: waited 0.001s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 554.610677] env[59518]: DEBUG oslo_vmware.service [-] Creating suds client with soap_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk' and wsdl_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk/vimService.wsdl' {{(pid=59518) __init__ /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:242}}
[ 557.722346] env[59518]: DEBUG oslo_vmware.service [-] Invoking ServiceInstance.RetrieveServiceContent with opID=oslo.vmware-96d829b8-10dc-4857-9aee-f013c4517521 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 557.738112] env[59518]: DEBUG oslo_vmware.api [-] Logging into host: vc1.osci.c.eu-de-1.cloud.sap. {{(pid=59518) _create_session /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:242}}
[ 557.738222] env[59518]: DEBUG oslo_vmware.service [-] Invoking SessionManager.Login with opID=oslo.vmware-90d9f0c3-73e1-4b80-bc39-1cb3d79edc2a {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 557.762391] env[59518]: INFO oslo_vmware.api [-] Successfully established new session; session ID is 31b66.
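The monkey-patching warning at the very top of this boot sequence means urllib3 was imported before eventlet patched the standard library, so urllib3 may hold references to unpatched socket/ssl objects. A minimal sketch of the import ordering that avoids the warning (only eventlet and urllib3 are taken from the log; the rest is illustrative):

    # Patch the stdlib first, before importing anything that caches
    # socket or ssl objects at module level.
    import eventlet
    eventlet.monkey_patch()

    import urllib3  # noqa: E402  -- safe only after monkey_patch()

    # Importing urllib3 above eventlet.monkey_patch() is what triggers
    # the "Modules with known eventlet monkey patching issues" warning.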
[ 557.762501] env[59518]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" "released" by "oslo_vmware.api.VMwareAPISession._create_session" :: held 3.152s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 557.763129] env[59518]: INFO nova.virt.vmwareapi.driver [None req-01161e66-312a-4f5f-b5c4-11197e24aaf3 None None] VMware vCenter version: 7.0.3
[ 557.766515] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb8289d9-459a-4bb7-a1d6-be2f51b54451 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 557.783893] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f22da0cd-f7a5-47d0-ac49-42b8eb949ff8 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 557.790332] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-49970bfa-5fc0-4aed-a590-6558602e63b4 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 557.796731] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-46c87b7c-317f-4f97-9f20-289c434b0b99 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 557.809450] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-94874e52-83e6-4f90-afa5-ebde76712ac5 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 557.815206] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-03c0e857-6388-4411-83a4-9d28a71432c5 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 557.845243] env[59518]: DEBUG oslo_vmware.service [-] Invoking ExtensionManager.FindExtension with opID=oslo.vmware-09fc70e8-9b71-4d63-a162-2f4eb2dcaac8 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 557.850779] env[59518]: DEBUG nova.virt.vmwareapi.driver [None req-01161e66-312a-4f5f-b5c4-11197e24aaf3 None None] Extension org.openstack.compute already exists. {{(pid=59518) _register_openstack_extension /opt/stack/nova/nova/virt/vmwareapi/driver.py:214}}
[ 557.853443] env[59518]: INFO nova.compute.provider_config [None req-01161e66-312a-4f5f-b5c4-11197e24aaf3 None None] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
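The session setup above (suds client creation, SessionManager.Login, and the oslo.vmware-prefixed opID on every SOAP call) is oslo.vmware's VMwareAPISession at work. A minimal sketch of the same call path, using a placeholder endpoint and credentials rather than the vc1.osci.c.eu-de-1.cloud.sap host from the log, and typical retry/poll values rather than this deployment's settings:

    from oslo_vmware import api, vim_util

    # Placeholder host and credentials; not taken from the log.
    session = api.VMwareAPISession(
        'vcenter.example.com', 'user', 'secret',
        10,    # api_retry_count: retries for transient API failures
        0.5)   # task_poll_interval: seconds between task status polls

    # Each invoke_api call logs one "Invoking <Object>.<Method> with
    # opID=oslo.vmware-..." DEBUG line like those above.
    result = session.invoke_api(vim_util, 'get_objects',
                                session.vim, 'VirtualMachine', 100)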
[ 557.871174] env[59518]: DEBUG nova.context [None req-01161e66-312a-4f5f-b5c4-11197e24aaf3 None None] Found 2 cells: 00000000-0000-0000-0000-000000000000(cell0),368c1eff-c63f-4f60-9cdc-f434ef763358(cell1) {{(pid=59518) load_cells /opt/stack/nova/nova/context.py:464}}
[ 557.873214] env[59518]: DEBUG oslo_concurrency.lockutils [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 557.873434] env[59518]: DEBUG oslo_concurrency.lockutils [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 557.874181] env[59518]: DEBUG oslo_concurrency.lockutils [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 557.874532] env[59518]: DEBUG oslo_concurrency.lockutils [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] Acquiring lock "368c1eff-c63f-4f60-9cdc-f434ef763358" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 557.874714] env[59518]: DEBUG oslo_concurrency.lockutils [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] Lock "368c1eff-c63f-4f60-9cdc-f434ef763358" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 557.875682] env[59518]: DEBUG oslo_concurrency.lockutils [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] Lock "368c1eff-c63f-4f60-9cdc-f434ef763358" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 557.901811] env[59518]: INFO dbcounter [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] Registered counter for database nova_cell0
[ 557.909889] env[59518]: INFO dbcounter [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] Registered counter for database nova_cell1
[ 557.912889] env[59518]: DEBUG oslo_db.sqlalchemy.engines [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=59518) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}}
[ 557.913231] env[59518]: DEBUG oslo_db.sqlalchemy.engines [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=59518) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}}
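The Acquiring/acquired/released triplets around each cell UUID come from oslo.concurrency, which logs how long each caller waited for and then held a named lock. A minimal sketch of the two forms that produce these lines (the lock names here are just examples):

    from oslo_concurrency import lockutils

    # Decorator form: emits 'Acquiring lock "..." by "..."', then the
    # waited/held timings seen above (the inner() wrapper in lockutils).
    @lockutils.synchronized('368c1eff-c63f-4f60-9cdc-f434ef763358')
    def get_or_set_cached_cell():
        return 42  # runs with the named in-process lock held

    # Context-manager form: emits the plain Acquiring/Acquired/Releasing
    # lines, as for "singleton_lock" further down.
    with lockutils.lock('singleton_lock'):
        pass

    get_or_set_cached_cell()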
[ 557.917527] env[59518]: DEBUG dbcounter [-] [59518] Writer thread running {{(pid=59518) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:99}}
[ 557.918218] env[59518]: DEBUG dbcounter [-] [59518] Writer thread running {{(pid=59518) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:99}}
[ 557.920458] env[59518]: ERROR nova.db.main.api [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] No DB access allowed in nova-compute: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 557.920458] env[59518]: result = function(*args, **kwargs)
[ 557.920458] env[59518]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 557.920458] env[59518]: return func(*args, **kwargs)
[ 557.920458] env[59518]: File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 557.920458] env[59518]: result = fn(*args, **kwargs)
[ 557.920458] env[59518]: File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 557.920458] env[59518]: return f(*args, **kwargs)
[ 557.920458] env[59518]: File "/opt/stack/nova/nova/objects/service.py", line 546, in _db_service_get_minimum_version
[ 557.920458] env[59518]: return db.service_get_minimum_version(context, binaries)
[ 557.920458] env[59518]: File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 557.920458] env[59518]: _check_db_access()
[ 557.920458] env[59518]: File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 557.920458] env[59518]: stacktrace = ''.join(traceback.format_stack())
[ 557.920458] env[59518]:
[ 557.921239] env[59518]: ERROR nova.db.main.api [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] No DB access allowed in nova-compute: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 557.921239] env[59518]: result = function(*args, **kwargs)
[ 557.921239] env[59518]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 557.921239] env[59518]: return func(*args, **kwargs)
[ 557.921239] env[59518]: File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 557.921239] env[59518]: result = fn(*args, **kwargs)
[ 557.921239] env[59518]: File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 557.921239] env[59518]: return f(*args, **kwargs)
[ 557.921239] env[59518]: File "/opt/stack/nova/nova/objects/service.py", line 546, in _db_service_get_minimum_version
[ 557.921239] env[59518]: return db.service_get_minimum_version(context, binaries)
[ 557.921239] env[59518]: File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 557.921239] env[59518]: _check_db_access()
[ 557.921239] env[59518]: File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 557.921239] env[59518]: stacktrace = ''.join(traceback.format_stack())
[ 557.921239] env[59518]:
[ 557.921577] env[59518]: WARNING nova.objects.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] Failed to get minimum service version for cell 368c1eff-c63f-4f60-9cdc-f434ef763358
[ 557.921715] env[59518]: WARNING nova.objects.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] Failed to get minimum service version for cell 00000000-0000-0000-0000-000000000000
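The two "No DB access allowed" errors are intentional: nova-compute installs a guard on the main-database API so that any direct query is logged with its full call stack and rejected, and the caller degrades gracefully (hence the "Failed to get minimum service version" warnings rather than a crash). A simplified sketch of that guard pattern, not nova's actual code; the names only approximate the traceback above:

    import functools
    import logging
    import traceback

    LOG = logging.getLogger(__name__)
    DISABLE_DB_ACCESS = True  # set in services that must not touch the DB

    def _check_db_access():
        if DISABLE_DB_ACCESS:
            # Capture and log the offending call stack, matching the
            # shape of the ERROR records above, then refuse the call.
            stacktrace = ''.join(traceback.format_stack())
            LOG.error('No DB access allowed in nova-compute: %s', stacktrace)
            raise RuntimeError('No DB access allowed in nova-compute')

    def guarded(f):
        # Applied to every DB API function; the guard runs before any
        # real database work can happen.
        @functools.wraps(f)
        def wrapper(*args, **kwargs):
            _check_db_access()
            return f(*args, **kwargs)
        return wrapper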
[ 557.922172] env[59518]: DEBUG oslo_concurrency.lockutils [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] Acquiring lock "singleton_lock" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 557.922327] env[59518]: DEBUG oslo_concurrency.lockutils [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] Acquired lock "singleton_lock" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 557.922563] env[59518]: DEBUG oslo_concurrency.lockutils [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] Releasing lock "singleton_lock" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 557.922872] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] Full set of CONF: {{(pid=59518) _wait_for_exit_or_signal /usr/local/lib/python3.10/dist-packages/oslo_service/service.py:362}}
[ 557.923008] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] ******************************************************************************** {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2589}}
[ 557.923130] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] Configuration options gathered from: {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2590}}
[ 557.923261] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] command line args: ['--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-cpu-common.conf', '--config-file', '/etc/nova/nova-cpu-1.conf'] {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2591}}
[ 557.923451] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2592}}
[ 557.923579] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] ================================================================================ {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2594}}
[ 557.923781] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] allow_resize_to_same_host = True {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.923955] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] arq_binding_timeout = 300 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.924094] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] backdoor_port = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.924220] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] backdoor_socket = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.924383] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] block_device_allocate_retries = 60 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.924543] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] block_device_allocate_retries_interval = 3 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.924707] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cert = self.pem {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.924867] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] compute_driver = vmwareapi.VMwareVCDriver {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.925041] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] compute_monitors = [] {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.925202] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] config_dir = [] {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.925368] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] config_drive_format = iso9660 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.925498] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.925658] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] config_source = [] {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.925819] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] console_host = devstack {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.925978] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] control_exchange = nova {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.926133] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cpu_allocation_ratio = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.926286] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] daemon = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.926446] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] debug = True {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.926598] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] default_access_ip_network_name = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.926755] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] default_availability_zone = nova {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.926904] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] default_ephemeral_format = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.927330] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.927330] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] default_schedule_zone = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.927465] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] disk_allocation_ratio = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.927569] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] enable_new_services = True {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.927737] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] enabled_apis = ['osapi_compute'] {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.927897] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] enabled_ssl_apis = [] {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.928118] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] flat_injected = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.928289] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] force_config_drive = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.928445] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] force_raw_images = True {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.928609] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] graceful_shutdown_timeout = 5 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.928787] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] heal_instance_info_cache_interval = 60 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.929015] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] host = cpu-1 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.929188] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] initial_cpu_allocation_ratio = 4.0 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.929350] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] initial_disk_allocation_ratio = 1.0 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.929508] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] initial_ram_allocation_ratio = 1.0 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.929714] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] injected_network_template = /opt/stack/nova/nova/virt/interfaces.template {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.929874] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] instance_build_timeout = 0 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.930078] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] instance_delete_interval = 300 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.930183] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] instance_format = [instance: %(uuid)s] {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.930341] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] instance_name_template = instance-%08x {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.930496] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] instance_usage_audit = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.930658] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] instance_usage_audit_period = month {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.930818] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] instance_uuid_format = [instance: %(uuid)s] {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.931009] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] instances_path = /opt/stack/data/nova/instances {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.931367] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] internal_service_availability_zone = internal {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.931367] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] key = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.931518] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] live_migration_retry_count = 30 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.931629] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] log_config_append = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.931789] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] log_date_format = %Y-%m-%d %H:%M:%S {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.931969] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] log_dir = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.932145] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] log_file = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.932271] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] log_options = True {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.932429] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] log_rotate_interval = 1 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.932591] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] log_rotate_interval_type = days {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.932750] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] log_rotation_type = none {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.932874] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] logging_context_format_string = %(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.932987] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] logging_debug_format_suffix = {{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}} {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.933149] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] logging_default_format_string = %(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.933306] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] logging_exception_prefix = ERROR %(name)s %(instance)s {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.933429] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.933587] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] long_rpc_timeout = 1800 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.933743] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] max_concurrent_builds = 10 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.933901] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] max_concurrent_live_migrations = 1 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.934054] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] max_concurrent_snapshots = 5 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.934207] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] max_local_block_devices = 3 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.934359] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] max_logfile_count = 30 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.934512] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] max_logfile_size_mb = 200 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.934666] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] maximum_instance_delete_attempts = 5 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.934827] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] metadata_listen = 0.0.0.0 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.934996] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] metadata_listen_port = 8775 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.935160] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] metadata_workers = 2 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.935315] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] migrate_max_retries = -1 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.935605] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] mkisofs_cmd = genisoimage {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.935836] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] my_block_storage_ip = 10.180.1.21 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.935970] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] my_ip = 10.180.1.21 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.936746] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] network_allocate_retries = 0 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.936952] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.937131] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] osapi_compute_listen = 0.0.0.0 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.937298] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] osapi_compute_listen_port = 8774 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.937466] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] osapi_compute_unique_server_name_scope = {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.937631] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] osapi_compute_workers = 2 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.937790] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] password_length = 12 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.937949] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] periodic_enable = True {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.938111] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] periodic_fuzzy_delay = 60 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.938275] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] pointer_model = usbtablet {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.938435] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] preallocate_images = none {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.938590] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] publish_errors = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.938718] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] pybasedir = /opt/stack/nova {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.938902] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] ram_allocation_ratio = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.939065] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] rate_limit_burst = 0 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.939517] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] rate_limit_except_level = CRITICAL {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.939712] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] rate_limit_interval = 0 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.939876] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] reboot_timeout = 0 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.940198] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] reclaim_instance_interval = 0 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.940384] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] record = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.940556] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] reimage_timeout_per_gb = 60 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.940718] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] report_interval = 120 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.940902] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] rescue_timeout = 0 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.941072] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] reserved_host_cpus = 0 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.941230] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] reserved_host_disk_mb = 0 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.941383] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] reserved_host_memory_mb = 512 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.941538] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] reserved_huge_pages = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.941692] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] resize_confirm_window = 0 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.941861] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] resize_fs_using_block_device = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.942030] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] resume_guests_state_on_host_boot = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.942192] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] rootwrap_config = /etc/nova/rootwrap.conf {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.942348] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] rpc_response_timeout = 60 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.942504] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] run_external_periodic_tasks = True {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.942667] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] running_deleted_instance_action = reap {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.942822] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] running_deleted_instance_poll_interval = 1800 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.942974] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] running_deleted_instance_timeout = 0 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.943126] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] scheduler_instance_sync_interval = 120 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.943255] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] service_down_time = 300 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.943415] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] servicegroup_driver = db {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.943570] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] shelved_offload_time = 0 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.943723] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] shelved_poll_interval = 3600 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.943886] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] shutdown_timeout = 0 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.944056] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] source_is_ipv6 = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.944213] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] ssl_only = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.944460] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] state_path = /opt/stack/data/n-cpu-1 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.944624] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] sync_power_state_interval = 600 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.944782] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] sync_power_state_pool_size = 1000 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.944945] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] syslog_log_facility = LOG_USER {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.945097] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] tempdir = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.945333] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] timeout_nbd = 10 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.945515] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] transport_url = **** {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.945674] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] update_resources_interval = 0 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.945830] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] use_cow_images = True {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.945988] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] use_eventlog = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.946142] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] use_journal = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.946292] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] use_json = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.946444] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] use_rootwrap_daemon = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.946595] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] use_stderr = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.946745] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] use_syslog = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.947029] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vcpu_pin_set = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.947172] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vif_plugging_is_fatal = True {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.947220] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vif_plugging_timeout = 300 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.947350] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] virt_mkfs = [] {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.947503] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] volume_usage_poll_interval = 0 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.947657] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] watch_log_file = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.947815] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] web = /usr/share/spice-html5 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 557.948008] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_concurrency.disable_process_locking = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 557.948333] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_concurrency.lock_path = /opt/stack/data/n-cpu-1 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 557.948516] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_messaging_metrics.metrics_buffer_size = 1000 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 557.948681] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_messaging_metrics.metrics_enabled = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 557.948912] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_messaging_metrics.metrics_process_name = {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 557.949109] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 557.949320] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 557.949545] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] api.auth_strategy = keystone {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 557.949760] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] api.compute_link_prefix = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 557.949979] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 557.950196] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] api.dhcp_domain = novalocal {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 557.950404] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] api.enable_instance_password = True {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 557.950628] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] api.glance_link_prefix = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 557.950864] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] api.instance_list_cells_batch_fixed_size = 100 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 557.951090] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] api.instance_list_cells_batch_strategy = distributed {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 557.951278] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] api.instance_list_per_project_cells = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 557.951475] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] api.list_records_by_skipping_down_cells = True {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 557.951674] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] api.local_metadata_per_cell = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 557.951894] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] api.max_limit = 1000 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 557.952136] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] api.metadata_cache_expiration = 15 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 557.952359] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] api.neutron_default_tenant_id = default {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 557.952567] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] api.use_forwarded_for = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 557.952771] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] api.use_neutron_default_nets = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 557.952994] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] api.vendordata_dynamic_connect_timeout = 5 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 557.953216] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] api.vendordata_dynamic_failure_fatal = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 557.953436] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] api.vendordata_dynamic_read_timeout = 5 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 557.953667] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] api.vendordata_dynamic_ssl_certfile = {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 557.953908] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] api.vendordata_dynamic_targets = [] {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 557.954133] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] api.vendordata_jsonfile_path = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 557.954377] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] api.vendordata_providers = ['StaticJSON'] {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 557.954615] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cache.backend = dogpile.cache.memcached {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 557.954840] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cache.backend_argument = **** {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 557.955078] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cache.config_prefix = cache.oslo {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 557.955295] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cache.dead_timeout = 60.0 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 557.955475] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cache.debug_cache_backend = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 557.955679] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cache.enable_retry_client = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 557.955879] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cache.enable_socket_keepalive = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 557.956154] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cache.enabled = True {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 557.956359] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cache.expiration_time = 600 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 557.956787] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cache.hashclient_retry_delay = 1.0 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.957011] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cache.memcache_dead_retry = 300 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.957237] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cache.memcache_password = {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.957457] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cache.memcache_pool_connection_get_timeout = 10 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.957748] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cache.memcache_pool_flush_on_reconnect = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.957979] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cache.memcache_pool_maxsize = 10 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.958194] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cache.memcache_pool_unused_timeout = 60 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.958412] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cache.memcache_sasl_enabled = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.958657] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cache.memcache_servers = ['localhost:11211'] {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.958885] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cache.memcache_socket_timeout = 1.0 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.959099] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cache.memcache_username = {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.959302] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cache.proxies = [] {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.959525] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cache.retry_attempts = 2 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.959742] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cache.retry_delay = 0.0 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.959936] env[59518]: DEBUG oslo_service.service
[None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cache.socket_keepalive_count = 1 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.960198] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cache.socket_keepalive_idle = 1 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.960428] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cache.socket_keepalive_interval = 1 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.960643] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cache.tls_allowed_ciphers = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.960865] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cache.tls_cafile = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.961083] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cache.tls_certfile = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.961295] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cache.tls_enabled = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.961505] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cache.tls_keyfile = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.961734] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cinder.auth_section = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.961958] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cinder.auth_type = password {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.962181] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cinder.cafile = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.962394] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cinder.catalog_info = volumev3::publicURL {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.962592] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cinder.certfile = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.962814] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cinder.collect_timing = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.963033] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cinder.cross_az_attach = True {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.963269] env[59518]: DEBUG oslo_service.service [None 
req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cinder.debug = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.963496] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cinder.endpoint_template = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.963724] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cinder.http_retries = 3 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.963940] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cinder.insecure = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.964186] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cinder.keyfile = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.964428] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cinder.os_region_name = RegionOne {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.964664] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cinder.split_loggers = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.964877] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cinder.timeout = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.965110] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] compute.consecutive_build_service_disable_threshold = 10 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.965394] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] compute.cpu_dedicated_set = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.965650] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] compute.cpu_shared_set = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.965882] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] compute.image_type_exclude_list = [] {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.966106] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] compute.live_migration_wait_for_vif_plug = True {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.966337] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] compute.max_concurrent_disk_ops = 0 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.966559] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] compute.max_disk_devices_to_attach = -1 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.966769] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] compute.packing_host_numa_cells_allocation_strategy = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
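The `cinder.catalog_info = volumev3::publicURL` record a few lines up is a colon-separated triple that Nova uses to pick the volume endpoint out of the Keystone service catalog. The sketch below is a hypothetical helper, not Nova's actual code, assuming the `<service_type>:<service_name>:<endpoint_type>` layout; the empty middle field means "match on type alone".

```python
# Hypothetical helper illustrating the catalog_info format.
def parse_catalog_info(catalog_info: str) -> dict:
    service_type, service_name, endpoint_type = catalog_info.split(':')
    return {
        'service_type': service_type,           # 'volumev3'
        'service_name': service_name or None,   # '' -> None (no name filter)
        'endpoint_type': endpoint_type,         # 'publicURL'
    }

print(parse_catalog_info('volumev3::publicURL'))
```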
[ 557.966986] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] compute.provider_config_location = /etc/nova/provider_config/ {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.967499] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] compute.resource_provider_association_refresh = 300 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.967499] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] compute.shutdown_retry_interval = 10 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.967639] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.967894] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] conductor.workers = 2 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.968156] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] console.allowed_origins = [] {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.972150] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] console.ssl_ciphers = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.972150] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] console.ssl_minimum_version = default {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.972150] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] consoleauth.token_ttl = 600 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.972150] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cyborg.cafile = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.972150] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cyborg.certfile = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.972150] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cyborg.collect_timing = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.972150] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cyborg.connect_retries = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.972582] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cyborg.connect_retry_delay = None {{(pid=59518) log_opt_values
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.972582] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cyborg.endpoint_override = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.972582] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cyborg.insecure = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.972582] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cyborg.keyfile = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.972582] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cyborg.max_version = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.972582] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cyborg.min_version = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.972582] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cyborg.region_name = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.972981] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cyborg.service_name = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.972981] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cyborg.service_type = accelerator {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.972981] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cyborg.split_loggers = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.972981] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cyborg.status_code_retries = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.972981] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cyborg.status_code_retry_delay = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.972981] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cyborg.timeout = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.972981] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cyborg.valid_interfaces = ['internal', 'public'] {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.973357] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] cyborg.version = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.973357] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] database.backend = sqlalchemy {{(pid=59518) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.973357] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] database.connection = **** {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.973357] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] database.connection_debug = 0 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.973357] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] database.connection_parameters = {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.973593] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] database.connection_recycle_time = 3600 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.973777] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] database.connection_trace = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.973987] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] database.db_inc_retry_interval = True {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.974208] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] database.db_max_retries = 20 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.974416] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] database.db_max_retry_interval = 10 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.974632] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] database.db_retry_interval = 1 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.974845] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] database.max_overflow = 50 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.975046] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] database.max_pool_size = 5 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.975225] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] database.max_retries = 10 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.975408] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] database.mysql_enable_ndb = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.975594] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] database.mysql_sql_mode = TRADITIONAL {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.975754] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] database.mysql_wsrep_sync_wait = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
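Several of the `database.*` values above map more or less directly onto SQLAlchemy engine parameters once oslo.db builds the engine (`database.connection` itself is masked as `****` because it embeds credentials). A rough sketch of that mapping follows, with a placeholder connection URL; the keyword names are real `create_engine()` parameters, but the one-to-one mapping is a simplification of what oslo.db actually does.

```python
from sqlalchemy import create_engine

# Placeholder URL: the real database.connection is masked in the log.
engine = create_engine(
    'mysql+pymysql://nova:PASSWORD@controller/nova',
    pool_size=5,        # database.max_pool_size = 5
    max_overflow=50,    # database.max_overflow = 50
    pool_recycle=3600,  # database.connection_recycle_time = 3600
    echo=False,         # database.connection_debug = 0 -> no SQL echo
)
```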
[ 557.975945] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] database.pool_timeout = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.976137] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] database.retry_interval = 10 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.976298] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] database.slave_connection = **** {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.976488] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] database.sqlite_synchronous = True {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.976698] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] database.use_db_reconnect = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.976916] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] api_database.backend = sqlalchemy {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.977158] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] api_database.connection = **** {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.977273] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] api_database.connection_debug = 0 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.977440] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] api_database.connection_parameters = {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.977602] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] api_database.connection_recycle_time = 3600 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.977766] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] api_database.connection_trace = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.977925] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] api_database.db_inc_retry_interval = True {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.978089] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] api_database.db_max_retries = 20 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.978247] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] api_database.db_max_retry_interval = 10 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.978406] env[59518]: DEBUG oslo_service.service [None
req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] api_database.db_retry_interval = 1 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.978571] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] api_database.max_overflow = 50 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.978732] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] api_database.max_pool_size = 5 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.978902] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] api_database.max_retries = 10 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.979057] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] api_database.mysql_enable_ndb = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.979222] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] api_database.mysql_sql_mode = TRADITIONAL {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.979376] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] api_database.mysql_wsrep_sync_wait = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.979535] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] api_database.pool_timeout = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.979700] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] api_database.retry_interval = 10 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.979856] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] api_database.slave_connection = **** {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.980028] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] api_database.sqlite_synchronous = True {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.980202] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] devices.enabled_mdev_types = [] {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.980374] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] ephemeral_storage_encryption.cipher = aes-xts-plain64 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.980533] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] ephemeral_storage_encryption.enabled = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.980691] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] ephemeral_storage_encryption.key_size = 512 {{(pid=59518) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.980880] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] glance.api_servers = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.981045] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] glance.cafile = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.981204] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] glance.certfile = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.981363] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] glance.collect_timing = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.981518] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] glance.connect_retries = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.981674] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] glance.connect_retry_delay = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.981850] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] glance.debug = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.981998] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] glance.default_trusted_certificate_ids = [] {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.982155] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] glance.enable_certificate_validation = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.982313] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] glance.enable_rbd_download = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.982467] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] glance.endpoint_override = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.982627] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] glance.insecure = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.982781] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] glance.keyfile = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.982935] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] glance.max_version = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.983089] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] glance.min_version = None {{(pid=59518) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.983246] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] glance.num_retries = 3 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.983409] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] glance.rbd_ceph_conf = {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.983566] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] glance.rbd_connect_timeout = 5 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.983727] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] glance.rbd_pool = {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.983898] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] glance.rbd_user = {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.984065] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] glance.region_name = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.984224] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] glance.service_name = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.984386] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] glance.service_type = image {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.984542] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] glance.split_loggers = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.984695] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] glance.status_code_retries = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.984849] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] glance.status_code_retry_delay = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.985004] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] glance.timeout = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.985178] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] glance.valid_interfaces = ['internal', 'public'] {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.985340] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] glance.verify_glance_signatures = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.985494] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] glance.version = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
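The `glance.*` block that ends above, like the `cyborg.*` group earlier and the `ironic.*` and `keystone.*` groups further down, is the standard keystoneauth1 client option set: session-level TLS and timeout knobs (`cafile`, `insecure`, `timeout`) plus adapter-level discovery knobs (`service_type`, `valid_interfaces`, `region_name`, `endpoint_override`, retry counts). A hedged sketch of how such a group typically becomes a client handle; the auth details are placeholders, not values from this log.

```python
# Placeholder credentials throughout; only the adapter keywords mirror the
# logged glance.* values.
from keystoneauth1 import adapter, session
from keystoneauth1.identity import v3

auth = v3.Password(
    auth_url='http://keystone.example:5000/v3',  # not in this log
    username='nova', password='****',
    project_name='service',
    user_domain_name='Default', project_domain_name='Default',
)
sess = session.Session(auth=auth, verify=True)  # glance.cafile / glance.insecure
image_client = adapter.Adapter(
    session=sess,
    service_type='image',              # glance.service_type = image
    interface=['internal', 'public'],  # glance.valid_interfaces
    region_name=None,                  # glance.region_name = None
    connect_retries=None,              # glance.connect_retries = None
)
# image_client.get('/v2/images') would now discover the endpoint from the
# service catalog and issue an authenticated request.
```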
[ 557.985656] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] guestfs.debug = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.985823] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] hyperv.config_drive_cdrom = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.985984] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] hyperv.config_drive_inject_password = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.986147] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] hyperv.dynamic_memory_ratio = 1.0 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.986307] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] hyperv.enable_instance_metrics_collection = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.986465] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] hyperv.enable_remotefx = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.986628] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] hyperv.instances_path_share = {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.986788] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] hyperv.iscsi_initiator_list = [] {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.986949] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] hyperv.limit_cpu_features = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.987108] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] hyperv.mounted_disk_query_retry_count = 10 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.987265] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] hyperv.mounted_disk_query_retry_interval = 5 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.987427] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] hyperv.power_state_check_timeframe = 60 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.987584] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] hyperv.power_state_event_polling_interval = 2 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.987747] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] hyperv.qemu_img_cmd = qemu-img.exe {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.987905] env[59518]: DEBUG oslo_service.service [None
req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] hyperv.use_multipath_io = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.988108] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] hyperv.volume_attach_retry_count = 10 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.988280] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] hyperv.volume_attach_retry_interval = 5 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.988437] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] hyperv.vswitch_name = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.988596] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] hyperv.wait_soft_reboot_seconds = 60 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.988780] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] mks.enabled = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.989143] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] mks.mksproxy_base_url = http://127.0.0.1:6090/ {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.989336] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] image_cache.manager_interval = 2400 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.989505] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] image_cache.precache_concurrency = 1 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.989672] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] image_cache.remove_unused_base_images = True {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.989839] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] image_cache.remove_unused_original_minimum_age_seconds = 86400 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.990003] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] image_cache.remove_unused_resized_minimum_age_seconds = 3600 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.990173] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] image_cache.subdirectory_name = _base {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.990432] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] ironic.api_max_retries = 60 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.990527] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] ironic.api_retry_interval = 2 {{(pid=59518) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.990664] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] ironic.auth_section = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.990819] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] ironic.auth_type = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.991001] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] ironic.cafile = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.991157] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] ironic.certfile = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.991317] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] ironic.collect_timing = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.991471] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] ironic.connect_retries = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.991626] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] ironic.connect_retry_delay = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.991778] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] ironic.endpoint_override = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.991938] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] ironic.insecure = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.992112] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] ironic.keyfile = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.992270] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] ironic.max_version = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.992424] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] ironic.min_version = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.992579] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] ironic.partition_key = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.992739] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] ironic.peer_list = [] {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.992893] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] ironic.region_name = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 
557.993055] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] ironic.serial_console_state_timeout = 10 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.993208] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] ironic.service_name = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.993372] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] ironic.service_type = baremetal {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.993581] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] ironic.split_loggers = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.993765] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] ironic.status_code_retries = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.993927] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] ironic.status_code_retry_delay = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.994086] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] ironic.timeout = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.994264] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] ironic.valid_interfaces = ['internal', 'public'] {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.994421] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] ironic.version = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.994595] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] key_manager.backend = nova.keymgr.conf_key_mgr.ConfKeyManager {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.994763] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] key_manager.fixed_key = **** {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.994941] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] barbican.auth_endpoint = http://localhost/identity/v3 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.995101] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] barbican.barbican_api_version = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.995258] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] barbican.barbican_endpoint = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.995425] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] barbican.barbican_endpoint_type = public 
{{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.995581] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] barbican.barbican_region_name = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.995735] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] barbican.cafile = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.995888] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] barbican.certfile = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.996095] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] barbican.collect_timing = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.996263] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] barbican.insecure = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.996417] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] barbican.keyfile = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.996589] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] barbican.number_of_retries = 60 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.996752] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] barbican.retry_delay = 1 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.996912] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] barbican.send_service_user_token = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.997071] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] barbican.split_loggers = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.997225] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] barbican.timeout = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.997409] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] barbican.verify_ssl = True {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.997583] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] barbican.verify_ssl_path = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.997749] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] barbican_service_user.auth_section = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.997919] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] barbican_service_user.auth_type = None 
{{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.998075] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] barbican_service_user.cafile = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.998227] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] barbican_service_user.certfile = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.998386] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] barbican_service_user.collect_timing = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.998544] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] barbican_service_user.insecure = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.998901] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] barbican_service_user.keyfile = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.999088] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] barbican_service_user.split_loggers = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.999250] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] barbican_service_user.timeout = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.999417] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vault.approle_role_id = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.999575] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vault.approle_secret_id = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.999731] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vault.cafile = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 557.999885] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vault.certfile = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.000064] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vault.collect_timing = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.000219] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vault.insecure = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.000373] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vault.keyfile = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.000538] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] 
vault.kv_mountpoint = secret {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.000699] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vault.kv_version = 2 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.000867] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vault.namespace = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.001037] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vault.root_token_id = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.001199] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vault.split_loggers = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.001353] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vault.ssl_ca_crt_file = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.001506] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vault.timeout = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.001663] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vault.use_ssl = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.001825] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vault.vault_url = http://127.0.0.1:8200 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.002014] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] keystone.cafile = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.002175] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] keystone.certfile = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.002337] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] keystone.collect_timing = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.002493] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] keystone.connect_retries = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.002649] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] keystone.connect_retry_delay = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.002801] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] keystone.endpoint_override = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.002958] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] keystone.insecure = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
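The `vault.*` group above configures castellan's Vault key-manager backend: `vault.vault_url` is the server, `vault.kv_mountpoint` and `vault.kv_version` select the KV v2 secrets engine, and the `approle_*` / `root_token_id` options (unset or masked here) choose the auth method. Roughly what that amounts to in terms of the `hvac` client, as an assumption-laden sketch rather than castellan's actual code:

```python
import hvac

# vault.vault_url = http://127.0.0.1:8200 (vault.use_ssl = False -> plain http)
client = hvac.Client(url='http://127.0.0.1:8200')
# vault.root_token_id / approle credentials would be supplied here; both are
# unset or masked in the log, so this token is purely a placeholder.
client.token = 'PLACEHOLDER'

# vault.kv_version = 2 selects the KV v2 API; vault.kv_mountpoint = secret.
# The secret path itself is made up for illustration.
secret = client.secrets.kv.v2.read_secret_version(
    path='nova/ephemeral-key', mount_point='secret',
)
print(secret['data']['data'])
```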
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.003112] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] keystone.keyfile = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.003266] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] keystone.max_version = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.003416] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] keystone.min_version = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.003568] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] keystone.region_name = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.003720] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] keystone.service_name = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.003886] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] keystone.service_type = identity {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.004058] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] keystone.split_loggers = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.004217] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] keystone.status_code_retries = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.004374] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] keystone.status_code_retry_delay = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.004529] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] keystone.timeout = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.004704] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] keystone.valid_interfaces = ['internal', 'public'] {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.004859] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] keystone.version = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.005054] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.connection_uri = {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.005211] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.cpu_mode = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.005375] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.cpu_model_extra_flags = [] {{(pid=59518) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.005541] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.cpu_models = [] {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.005705] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.cpu_power_governor_high = performance {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.005871] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.cpu_power_governor_low = powersave {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.006033] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.cpu_power_management = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.006199] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.cpu_power_management_strategy = cpu_state {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.006358] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.device_detach_attempts = 8 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.006517] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.device_detach_timeout = 20 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.006680] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.disk_cachemodes = [] {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.006835] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.disk_prefix = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.006998] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.enabled_perf_events = [] {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.007158] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.file_backed_memory = 0 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.007316] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.gid_maps = [] {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.007470] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.hw_disk_discard = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.007623] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.hw_machine_type = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.007786] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] 
libvirt.images_rbd_ceph_conf = {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.007944] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.images_rbd_glance_copy_poll_interval = 15 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.008155] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.images_rbd_glance_copy_timeout = 600 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.008327] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.images_rbd_glance_store_name = {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.008491] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.images_rbd_pool = rbd {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.008655] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.images_type = default {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.008833] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.images_volume_group = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.009003] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.inject_key = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.009161] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.inject_partition = -2 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.009319] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.inject_password = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.009476] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.iscsi_iface = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.009632] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.iser_use_multipath = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.009788] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.live_migration_bandwidth = 0 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.009946] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.live_migration_completion_timeout = 800 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.010134] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.live_migration_downtime = 500 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.010302] env[59518]: DEBUG 
oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.live_migration_downtime_delay = 75 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.010458] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.live_migration_downtime_steps = 10 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.010613] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.live_migration_inbound_addr = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.010774] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.live_migration_permit_auto_converge = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.010961] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.live_migration_permit_post_copy = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.011128] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.live_migration_scheme = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.011294] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.live_migration_timeout_action = abort {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.011451] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.live_migration_tunnelled = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.011604] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.live_migration_uri = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.011761] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.live_migration_with_native_tls = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.011916] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.max_queues = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.012085] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.mem_stats_period_seconds = 10 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.012241] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.nfs_mount_options = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.012542] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.nfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.012711] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] 
libvirt.num_aoe_discover_tries = 3 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.012870] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.num_iser_scan_tries = 5 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.013025] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.num_memory_encrypted_guests = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.013195] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.num_nvme_discover_tries = 5 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.013365] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.num_pcie_ports = 0 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.013526] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.num_volume_scan_tries = 5 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.013683] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.pmem_namespaces = [] {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.013833] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.quobyte_client_cfg = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.014109] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.quobyte_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.014276] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.rbd_connect_timeout = 5 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.014435] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.rbd_destroy_volume_retries = 12 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.014591] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.rbd_destroy_volume_retry_interval = 5 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.014744] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.rbd_secret_uuid = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.014893] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.rbd_user = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.015049] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.realtime_scheduler_priority = 1 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.015210] env[59518]: 
DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.remote_filesystem_transport = ssh {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.015360] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.rescue_image_id = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.015510] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.rescue_kernel_id = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.015657] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.rescue_ramdisk_id = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.015814] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.rng_dev_path = /dev/urandom {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.015966] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.rx_queue_size = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.016141] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.smbfs_mount_options = {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.016408] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.smbfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.016573] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.snapshot_compression = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.016726] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.snapshot_image_format = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.016935] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.snapshots_directory = /opt/stack/data/nova/instances/snapshots {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.017095] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.sparse_logical_volumes = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.017264] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.swtpm_enabled = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.017491] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.swtpm_group = tss {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.017667] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.swtpm_user = tss {{(pid=59518) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.017833] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.sysinfo_serial = unique {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.017990] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.tx_queue_size = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.018148] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.uid_maps = [] {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.018303] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.use_virtio_for_bridges = True {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.018466] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.virt_type = kvm {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.018624] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.volume_clear = zero {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.018801] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.volume_clear_size = 0 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.018974] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.volume_use_multipath = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.019152] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.vzstorage_cache_path = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.019329] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.019490] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.vzstorage_mount_group = qemu {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.019648] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.vzstorage_mount_opts = [] {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.019807] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.vzstorage_mount_perms = 0770 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.020108] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.vzstorage_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.020284] env[59518]: DEBUG oslo_service.service [None 
req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.vzstorage_mount_user = stack {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.020445] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] libvirt.wait_soft_reboot_seconds = 120 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.020611] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] neutron.auth_section = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.020775] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] neutron.auth_type = password {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.020961] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] neutron.cafile = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.021129] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] neutron.certfile = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.021289] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] neutron.collect_timing = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.021493] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] neutron.connect_retries = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.021672] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] neutron.connect_retry_delay = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.021848] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] neutron.default_floating_pool = public {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.022037] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] neutron.endpoint_override = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.022199] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] neutron.extension_sync_interval = 600 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.022354] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] neutron.http_retries = 3 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.022507] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] neutron.insecure = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.022660] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] neutron.keyfile = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.022809] env[59518]: DEBUG oslo_service.service [None 
req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] neutron.max_version = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.022971] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] neutron.metadata_proxy_shared_secret = **** {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.023126] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] neutron.min_version = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.023284] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] neutron.ovs_bridge = br-int {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.023439] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] neutron.physnets = [] {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.023597] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] neutron.region_name = RegionOne {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.023756] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] neutron.service_metadata_proxy = True {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.023907] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] neutron.service_name = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.024081] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] neutron.service_type = network {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.024240] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] neutron.split_loggers = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.024391] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] neutron.status_code_retries = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.024543] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] neutron.status_code_retry_delay = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.024694] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] neutron.timeout = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.024864] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] neutron.valid_interfaces = ['internal', 'public'] {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.025019] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] neutron.version = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.025184] env[59518]: DEBUG 
oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] notifications.bdms_in_notifications = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.025350] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] notifications.default_level = INFO {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.025514] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] notifications.notification_format = unversioned {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.025668] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] notifications.notify_on_state_change = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.025834] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] notifications.versioned_notifications_topics = ['versioned_notifications'] {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.026004] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] pci.alias = [] {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.026168] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] pci.device_spec = [] {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.026328] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] pci.report_in_placement = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.026491] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] placement.auth_section = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.026657] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] placement.auth_type = password {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.026816] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] placement.auth_url = http://10.180.1.21/identity {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.026968] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] placement.cafile = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.027119] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] placement.certfile = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.027272] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] placement.collect_timing = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.027423] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] placement.connect_retries = None {{(pid=59518) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.027572] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] placement.connect_retry_delay = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.027720] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] placement.default_domain_id = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.027872] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] placement.default_domain_name = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.028033] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] placement.domain_id = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.028183] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] placement.domain_name = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.028333] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] placement.endpoint_override = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.028484] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] placement.insecure = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.028633] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] placement.keyfile = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.028805] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] placement.max_version = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.028968] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] placement.min_version = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.029132] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] placement.password = **** {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.029284] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] placement.project_domain_id = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.029443] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] placement.project_domain_name = Default {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.029601] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] placement.project_id = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.029766] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] placement.project_name = service {{(pid=59518) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.029929] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] placement.region_name = RegionOne {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.030085] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] placement.service_name = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.030246] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] placement.service_type = placement {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.030404] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] placement.split_loggers = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.030559] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] placement.status_code_retries = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.030713] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] placement.status_code_retry_delay = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.030887] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] placement.system_scope = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.031057] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] placement.timeout = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.031217] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] placement.trust_id = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.031371] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] placement.user_domain_id = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.031533] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] placement.user_domain_name = Default {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.031688] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] placement.user_id = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.031868] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] placement.username = placement {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.032073] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] placement.valid_interfaces = ['internal', 'public'] {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.032239] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] 
placement.version = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.032413] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] quota.cores = 20 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.032572] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] quota.count_usage_from_placement = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.032739] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] quota.driver = nova.quota.DbQuotaDriver {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.032907] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] quota.injected_file_content_bytes = 10240 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.033072] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] quota.injected_file_path_length = 255 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.033232] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] quota.injected_files = 5 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.033390] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] quota.instances = 10 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.033545] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] quota.key_pairs = 100 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.033701] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] quota.metadata_items = 128 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.033861] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] quota.ram = 51200 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.034049] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] quota.recheck_quota = True {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.034216] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] quota.server_group_members = 10 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.034374] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] quota.server_groups = 10 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.034537] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] rdp.enabled = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.034844] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] rdp.html5_proxy_base_url = 
http://127.0.0.1:6083/ {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.035026] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] scheduler.discover_hosts_in_cells_interval = -1 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.035189] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] scheduler.enable_isolated_aggregate_filtering = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.035346] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] scheduler.image_metadata_prefilter = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.035502] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] scheduler.limit_tenants_to_placement_aggregate = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.035661] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] scheduler.max_attempts = 3 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.035818] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] scheduler.max_placement_results = 1000 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.035975] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] scheduler.placement_aggregate_required_for_tenants = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.036148] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] scheduler.query_placement_for_availability_zone = True {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.036303] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] scheduler.query_placement_for_image_type_support = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.036457] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] scheduler.query_placement_for_routed_network_aggregates = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.036638] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] scheduler.workers = 2 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.036845] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] filter_scheduler.aggregate_image_properties_isolation_namespace = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.037055] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] filter_scheduler.aggregate_image_properties_isolation_separator = . 
{{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.037286] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.037490] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] filter_scheduler.build_failure_weight_multiplier = 1000000.0 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.037670] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] filter_scheduler.cpu_weight_multiplier = 1.0 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.037837] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.038001] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] filter_scheduler.disk_weight_multiplier = 1.0 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.038195] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter', 'SameHostFilter', 'DifferentHostFilter'] {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.038361] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] filter_scheduler.host_subset_size = 1 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.038518] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] filter_scheduler.image_properties_default_architecture = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.038712] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] filter_scheduler.io_ops_weight_multiplier = -1.0 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.038900] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] filter_scheduler.isolated_hosts = [] {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.039125] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] filter_scheduler.isolated_images = [] {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.039330] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] filter_scheduler.max_instances_per_host = 50 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.039514] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] filter_scheduler.max_io_ops_per_host = 8 {{(pid=59518) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.039679] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] filter_scheduler.pci_in_placement = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.039840] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] filter_scheduler.pci_weight_multiplier = 1.0 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.040062] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] filter_scheduler.ram_weight_multiplier = 1.0 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.040177] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.040353] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] filter_scheduler.shuffle_best_same_weighed_hosts = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.040546] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] filter_scheduler.soft_affinity_weight_multiplier = 1.0 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.040783] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.041003] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] filter_scheduler.track_instance_changes = True {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.041192] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.041364] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] metrics.required = True {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.041529] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] metrics.weight_multiplier = 1.0 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.041691] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] metrics.weight_of_unavailable = -10000.0 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.041871] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] metrics.weight_setting = [] {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.042183] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] serial_console.base_url = ws://127.0.0.1:6083/ {{(pid=59518) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.042356] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] serial_console.enabled = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.042529] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] serial_console.port_range = 10000:20000 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.042696] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] serial_console.proxyclient_address = 127.0.0.1 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.042863] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] serial_console.serialproxy_host = 0.0.0.0 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.043032] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] serial_console.serialproxy_port = 6083 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.043202] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] service_user.auth_section = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.043371] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] service_user.auth_type = password {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.043526] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] service_user.cafile = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.043681] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] service_user.certfile = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.043840] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] service_user.collect_timing = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.044000] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] service_user.insecure = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.044194] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] service_user.keyfile = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.044362] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] service_user.send_service_user_token = True {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.044522] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] service_user.split_loggers = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.044676] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None 
None] service_user.timeout = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.044840] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] spice.agent_enabled = True {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.045018] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] spice.enabled = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.045308] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.045508] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] spice.html5proxy_host = 0.0.0.0 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.045681] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] spice.html5proxy_port = 6082 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.045841] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] spice.image_compression = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.046023] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] spice.jpeg_compression = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.046193] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] spice.playback_compression = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.046364] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] spice.server_listen = 127.0.0.1 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.046530] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] spice.server_proxyclient_address = 127.0.0.1 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.046686] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] spice.streaming_mode = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.046839] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] spice.zlib_compression = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.047002] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] upgrade_levels.baseapi = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.047162] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] upgrade_levels.cert = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.047328] env[59518]: DEBUG oslo_service.service [None 
req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] upgrade_levels.compute = auto {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.047486] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] upgrade_levels.conductor = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.047639] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] upgrade_levels.scheduler = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.047802] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vendordata_dynamic_auth.auth_section = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.047961] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vendordata_dynamic_auth.auth_type = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.048134] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vendordata_dynamic_auth.cafile = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.048288] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vendordata_dynamic_auth.certfile = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.048450] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vendordata_dynamic_auth.collect_timing = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.048606] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vendordata_dynamic_auth.insecure = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.048762] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vendordata_dynamic_auth.keyfile = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.048919] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vendordata_dynamic_auth.split_loggers = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.049076] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vendordata_dynamic_auth.timeout = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.049244] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vmware.api_retry_count = 10 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.049399] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vmware.ca_file = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.049564] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vmware.cache_prefix = devstack-image-cache {{(pid=59518) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.049725] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vmware.cluster_name = testcl1 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.049886] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vmware.connection_pool_size = 10 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.050042] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vmware.console_delay_seconds = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.050204] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vmware.datastore_regex = ^datastore.* {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.050403] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vmware.host_ip = vc1.osci.c.eu-de-1.cloud.sap {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.050567] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vmware.host_password = **** {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.050728] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vmware.host_port = 443 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.050915] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vmware.host_username = administrator@vsphere.local {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.051095] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vmware.insecure = True {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.051255] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vmware.integration_bridge = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.051416] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vmware.maximum_objects = 100 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.051570] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vmware.pbm_default_policy = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.051728] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vmware.pbm_enabled = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.051909] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vmware.pbm_wsdl_location = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.052107] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vmware.serial_log_dir = 
/opt/vmware/vspc {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.052268] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vmware.serial_port_proxy_uri = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.052423] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vmware.serial_port_service_uri = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.052585] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vmware.task_poll_interval = 0.5 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.052752] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vmware.use_linked_clone = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.052915] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vmware.vnc_keymap = en-us {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.053079] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vmware.vnc_port = 5900 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.053239] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vmware.vnc_port_total = 10000 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.053417] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vnc.auth_schemes = ['none'] {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.053586] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vnc.enabled = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.053872] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vnc.novncproxy_base_url = http://127.0.0.1:6080/vnc_auto.html {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.054055] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vnc.novncproxy_host = 0.0.0.0 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.054223] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vnc.novncproxy_port = 6080 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.054396] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vnc.server_listen = 127.0.0.1 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.054565] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vnc.server_proxyclient_address = 127.0.0.1 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.054722] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 
None None] vnc.vencrypt_ca_certs = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.054879] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vnc.vencrypt_client_cert = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.055034] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vnc.vencrypt_client_key = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.055209] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] workarounds.disable_compute_service_check_for_ffu = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.055366] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] workarounds.disable_fallback_pcpu_query = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.055522] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] workarounds.disable_group_policy_check_upcall = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.055676] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] workarounds.disable_libvirt_livesnapshot = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.055833] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] workarounds.disable_rootwrap = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.055990] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] workarounds.enable_numa_live_migration = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.056162] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] workarounds.enable_qemu_monitor_announce_self = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.056319] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.056473] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] workarounds.handle_virt_lifecycle_events = True {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.056627] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] workarounds.libvirt_disable_apic = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.056780] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] workarounds.never_download_image_if_on_rbd = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.056937] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] 
workarounds.qemu_monitor_announce_self_count = 3 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.057095] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] workarounds.qemu_monitor_announce_self_interval = 1 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.057252] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] workarounds.reserve_disk_resource_for_image_cache = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.057407] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] workarounds.skip_cpu_compare_at_startup = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.057562] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] workarounds.skip_cpu_compare_on_dest = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.057715] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] workarounds.skip_hypervisor_version_check_on_lm = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.057868] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] workarounds.skip_reserve_in_use_ironic_nodes = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.058048] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] workarounds.unified_limits_count_pcpu_as_vcpu = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.058224] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.058403] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] wsgi.api_paste_config = /etc/nova/api-paste.ini {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.058568] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] wsgi.client_socket_timeout = 900 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.058731] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] wsgi.default_pool_size = 1000 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.058889] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] wsgi.keep_alive = True {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.059052] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] wsgi.max_header_line = 16384 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.059209] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] wsgi.secure_proxy_ssl_header 
= None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.059365] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] wsgi.ssl_ca_file = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.059520] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] wsgi.ssl_cert_file = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.059676] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] wsgi.ssl_key_file = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.059836] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] wsgi.tcp_keepidle = 600 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.060010] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.060288] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] zvm.ca_file = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.060449] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] zvm.cloud_connector_url = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.060726] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] zvm.image_tmp_path = /opt/stack/data/n-cpu-1/images {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.060919] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] zvm.reachable_timeout = 300 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.061108] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_policy.enforce_new_defaults = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.061279] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_policy.enforce_scope = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.061448] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_policy.policy_default_rule = default {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.061622] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_policy.policy_dirs = ['policy.d'] {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.061789] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_policy.policy_file = policy.yaml {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.061991] 
env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_policy.remote_content_type = application/x-www-form-urlencoded {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.062163] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_policy.remote_ssl_ca_crt_file = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.062431] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_policy.remote_ssl_client_crt_file = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.062506] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_policy.remote_ssl_client_key_file = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.062620] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_policy.remote_ssl_verify_server_crt = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.062783] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_versionedobjects.fatal_exception_format_errors = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.062952] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.063123] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] profiler.connection_string = messaging:// {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.063287] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] profiler.enabled = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.063688] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] profiler.es_doc_type = notification {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.063688] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] profiler.es_scroll_size = 10000 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.063787] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] profiler.es_scroll_time = 2m {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.063911] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] profiler.filter_error_trace = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.064088] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] profiler.hmac_keys = SECRET_KEY {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.064256] env[59518]: DEBUG oslo_service.service [None 
req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] profiler.sentinel_service_name = mymaster {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.064536] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] profiler.socket_timeout = 0.1 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.064678] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] profiler.trace_sqlalchemy = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.064731] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] remote_debug.host = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.064891] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] remote_debug.port = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.065069] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_messaging_rabbit.amqp_auto_delete = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.065227] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_messaging_rabbit.amqp_durable_queues = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.065384] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_messaging_rabbit.conn_pool_min_size = 2 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.065540] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_messaging_rabbit.conn_pool_ttl = 1200 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.065694] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_messaging_rabbit.direct_mandatory_flag = True {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.065848] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_messaging_rabbit.enable_cancel_on_failover = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.066004] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_messaging_rabbit.heartbeat_in_pthread = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.066158] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_messaging_rabbit.heartbeat_rate = 2 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.066312] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.066461] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] 
oslo_messaging_rabbit.kombu_compression = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.066622] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_messaging_rabbit.kombu_failover_strategy = round-robin {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.066782] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.066944] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.067103] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_messaging_rabbit.rabbit_ha_queues = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.067258] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_messaging_rabbit.rabbit_interval_max = 30 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.067425] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.067580] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.067735] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.067893] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.068065] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.068222] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_messaging_rabbit.rabbit_quorum_queue = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.068380] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_messaging_rabbit.rabbit_retry_backoff = 2 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.068536] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_messaging_rabbit.rabbit_retry_interval = 1 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.068693] env[59518]: DEBUG 
oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.068854] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_messaging_rabbit.rpc_conn_pool_size = 30 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.069016] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_messaging_rabbit.ssl = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.069183] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_messaging_rabbit.ssl_ca_file = {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.069348] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_messaging_rabbit.ssl_cert_file = {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.069505] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_messaging_rabbit.ssl_enforce_fips_mode = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.069670] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_messaging_rabbit.ssl_key_file = {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.069835] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_messaging_rabbit.ssl_version = {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.070043] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_messaging_notifications.driver = ['messagingv2'] {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.070222] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_messaging_notifications.retry = -1 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.070403] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_messaging_notifications.topics = ['notifications'] {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.070570] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_messaging_notifications.transport_url = **** {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.070735] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_limit.auth_section = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.070919] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_limit.auth_type = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.071088] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None 
None] oslo_limit.cafile = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.071243] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_limit.certfile = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.071399] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_limit.collect_timing = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.071554] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_limit.connect_retries = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.071708] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_limit.connect_retry_delay = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.071876] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_limit.endpoint_id = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.072061] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_limit.endpoint_override = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.072226] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_limit.insecure = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.072380] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_limit.keyfile = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.072610] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_limit.max_version = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.072682] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_limit.min_version = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.072835] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_limit.region_name = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.072993] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_limit.service_name = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.073149] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_limit.service_type = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.073305] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_limit.split_loggers = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.073457] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None 
None] oslo_limit.status_code_retries = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.073610] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_limit.status_code_retry_delay = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.073836] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_limit.timeout = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.074023] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_limit.valid_interfaces = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.074223] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_limit.version = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.074344] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_reports.file_event_handler = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.074506] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_reports.file_event_handler_interval = 1 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.074657] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] oslo_reports.log_dir = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.074848] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vif_plug_linux_bridge_privileged.capabilities = [12] {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.074980] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vif_plug_linux_bridge_privileged.group = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.075136] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vif_plug_linux_bridge_privileged.helper_command = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.075291] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.075449] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vif_plug_linux_bridge_privileged.thread_pool_size = 8 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.075601] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vif_plug_linux_bridge_privileged.user = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.075765] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vif_plug_ovs_privileged.capabilities = [12, 1] {{(pid=59518) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.075918] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vif_plug_ovs_privileged.group = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.076092] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vif_plug_ovs_privileged.helper_command = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.076255] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.076499] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vif_plug_ovs_privileged.thread_pool_size = 8 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.076676] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] vif_plug_ovs_privileged.user = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.076847] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] os_vif_linux_bridge.flat_interface = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.077026] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] os_vif_linux_bridge.forward_bridge_interface = ['all'] {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.077210] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] os_vif_linux_bridge.iptables_bottom_regex = {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.077446] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] os_vif_linux_bridge.iptables_drop_action = DROP {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.077622] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] os_vif_linux_bridge.iptables_top_regex = {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.077791] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] os_vif_linux_bridge.network_device_mtu = 1500 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.077953] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] os_vif_linux_bridge.use_ipv6 = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.078113] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] os_vif_linux_bridge.vlan_interface = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.078285] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] os_vif_ovs.isolate_vif = False {{(pid=59518) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.078449] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] os_vif_ovs.network_device_mtu = 1500 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.078609] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] os_vif_ovs.ovs_vsctl_timeout = 120 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.078773] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.078937] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] os_vif_ovs.ovsdb_interface = native {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.079099] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] os_vif_ovs.per_port_bridge = False {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.079259] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] os_brick.lock_path = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.079423] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] privsep_osbrick.capabilities = [21] {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.079576] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] privsep_osbrick.group = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.079728] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] privsep_osbrick.helper_command = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.079924] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] privsep_osbrick.logger_name = os_brick.privileged {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.080049] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] privsep_osbrick.thread_pool_size = 8 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.080209] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] privsep_osbrick.user = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.080434] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.080613] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] nova_sys_admin.group = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 558.080769] env[59518]: DEBUG oslo_service.service [None 
req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] nova_sys_admin.helper_command = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.080957] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] nova_sys_admin.logger_name = oslo_privsep.daemon {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.081128] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] nova_sys_admin.thread_pool_size = 8 {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.081283] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] nova_sys_admin.user = None {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 558.081409] env[59518]: DEBUG oslo_service.service [None req-ddc87f7a-a323-4cf4-93bd-bf01115755a9 None None] ******************************************************************************** {{(pid=59518) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2613}}
[ 558.081806] env[59518]: INFO nova.service [-] Starting compute node (version 0.1.0)
[ 558.090554] env[59518]: INFO nova.virt.node [None req-653ceaa5-04bb-4f69-926a-008272ea74fa None None] Generated node identity ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd
[ 558.090804] env[59518]: INFO nova.virt.node [None req-653ceaa5-04bb-4f69-926a-008272ea74fa None None] Wrote node identity ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd to /opt/stack/data/n-cpu-1/compute_id
[ 558.102543] env[59518]: WARNING nova.compute.manager [None req-653ceaa5-04bb-4f69-926a-008272ea74fa None None] Compute nodes ['ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd'] for host cpu-1 were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
[ 558.134025] env[59518]: INFO nova.compute.manager [None req-653ceaa5-04bb-4f69-926a-008272ea74fa None None] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
[ 558.158383] env[59518]: WARNING nova.compute.manager [None req-653ceaa5-04bb-4f69-926a-008272ea74fa None None] No compute node record found for host cpu-1. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host cpu-1 could not be found.
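The two nova.virt.node INFO lines above are the first-boot path of Nova's persistent node identity: no /opt/stack/data/n-cpu-1/compute_id file exists yet, so a fresh UUID is generated and written out, and subsequent starts are expected to read the same identity back rather than mint a new one. A minimal sketch of that get-or-create pattern, illustrative only and not Nova's actual implementation (the path and UUID come from the log above):

    import uuid
    from pathlib import Path

    def get_or_create_node_uuid(state_dir: str = "/opt/stack/data/n-cpu-1") -> str:
        # Reuse the persisted identity if the file already exists...
        ident_file = Path(state_dir) / "compute_id"
        if ident_file.exists():
            return ident_file.read_text().strip()
        # ...otherwise mint one (e.g. ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd) and persist it.
        node_uuid = str(uuid.uuid4())
        ident_file.parent.mkdir(parents=True, exist_ok=True)
        ident_file.write_text(node_uuid + "\n")
        return node_uuid

The two WARNINGs that follow are the expected complement on a first start: the identity file was only just created, so no matching compute node row can exist in the database yet.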
[ 558.158616] env[59518]: DEBUG oslo_concurrency.lockutils [None req-653ceaa5-04bb-4f69-926a-008272ea74fa None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 558.158826] env[59518]: DEBUG oslo_concurrency.lockutils [None req-653ceaa5-04bb-4f69-926a-008272ea74fa None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 558.159000] env[59518]: DEBUG oslo_concurrency.lockutils [None req-653ceaa5-04bb-4f69-926a-008272ea74fa None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 558.159147] env[59518]: DEBUG nova.compute.resource_tracker [None req-653ceaa5-04bb-4f69-926a-008272ea74fa None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59518) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}}
[ 558.160268] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-24969ace-06e4-451c-8a4d-d76617029498 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 558.168829] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-74209394-e5e7-450f-9030-d0f82d67557d {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 558.184058] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5b913d99-2219-4290-a29a-ad4b08bcfaf6 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 558.190736] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0abb88d8-c3fe-4801-aea8-35e4bf43fcdc {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 558.219262] env[59518]: DEBUG nova.compute.resource_tracker [None req-653ceaa5-04bb-4f69-926a-008272ea74fa None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181784MB free_disk=174GB free_vcpus=48 pci_devices=None {{(pid=59518) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}}
[ 558.219431] env[59518]: DEBUG oslo_concurrency.lockutils [None req-653ceaa5-04bb-4f69-926a-008272ea74fa None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 558.219581] env[59518]: DEBUG oslo_concurrency.lockutils [None req-653ceaa5-04bb-4f69-926a-008272ea74fa None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 558.231628] env[59518]: WARNING nova.compute.resource_tracker [None req-653ceaa5-04bb-4f69-926a-008272ea74fa None None] No compute node record for cpu-1:ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd could not be found.
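The Acquiring/acquired/"released" triples around clean_compute_node_cache and _update_available_resource are the standard oslo.concurrency trace for a named in-process lock: the resource tracker serializes everything that touches its state through the single "compute_resources" lock. A minimal sketch of the pattern that produces these DEBUG lines (Nova actually wraps this in its own synchronized decorator, but the underlying primitive is the same; the function body here is a placeholder):

    from oslo_concurrency import lockutils

    def update_available_resource():
        # Entering the context logs 'Acquiring lock "compute_resources" ...' and
        # 'Lock "compute_resources" acquired ... waited N.NNNs'; leaving it logs
        # 'Lock "compute_resources" "released" ... held N.NNNs'.
        with lockutils.lock("compute_resources"):
            pass  # audit hypervisor resources, then push inventory to placement

    update_available_resource()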
record for cpu-1:ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd could not be found. [ 558.244610] env[59518]: INFO nova.compute.resource_tracker [None req-653ceaa5-04bb-4f69-926a-008272ea74fa None None] Compute node record created for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 with uuid: ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd [ 558.290795] env[59518]: DEBUG nova.compute.resource_tracker [None req-653ceaa5-04bb-4f69-926a-008272ea74fa None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=59518) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 558.290964] env[59518]: DEBUG nova.compute.resource_tracker [None req-653ceaa5-04bb-4f69-926a-008272ea74fa None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=59518) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 558.383257] env[59518]: INFO nova.scheduler.client.report [None req-653ceaa5-04bb-4f69-926a-008272ea74fa None None] [req-d18632f6-126a-4f7a-a0ac-73c197b91374] Created resource provider record via placement API for resource provider with UUID ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd and name domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28. [ 558.399057] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3caec07c-8a52-4045-bce0-a9fe11678fdb {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 558.406401] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16793ebe-2982-4d46-b071-6b0df2df6e30 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 558.435487] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09602edd-b145-41f0-be9c-dd81c95eb67b {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 558.442015] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4d45e5e3-98d8-4573-9d31-8d1cba6e87f7 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 558.454219] env[59518]: DEBUG nova.compute.provider_tree [None req-653ceaa5-04bb-4f69-926a-008272ea74fa None None] Updating inventory in ProviderTree for provider ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 174, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59518) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 558.489603] env[59518]: DEBUG nova.scheduler.client.report [None req-653ceaa5-04bb-4f69-926a-008272ea74fa None None] Updated inventory for provider ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd with generation 0 in Placement from set_inventory_for_provider using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 
'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 174, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59518) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:957}} [ 558.489814] env[59518]: DEBUG nova.compute.provider_tree [None req-653ceaa5-04bb-4f69-926a-008272ea74fa None None] Updating resource provider ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd generation from 0 to 1 during operation: update_inventory {{(pid=59518) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 558.489947] env[59518]: DEBUG nova.compute.provider_tree [None req-653ceaa5-04bb-4f69-926a-008272ea74fa None None] Updating inventory in ProviderTree for provider ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 174, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59518) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 558.529614] env[59518]: DEBUG nova.compute.provider_tree [None req-653ceaa5-04bb-4f69-926a-008272ea74fa None None] Updating resource provider ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd generation from 1 to 2 during operation: update_traits {{(pid=59518) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 558.547920] env[59518]: DEBUG nova.compute.resource_tracker [None req-653ceaa5-04bb-4f69-926a-008272ea74fa None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59518) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 558.548098] env[59518]: DEBUG oslo_concurrency.lockutils [None req-653ceaa5-04bb-4f69-926a-008272ea74fa None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.328s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 558.548250] env[59518]: DEBUG nova.service [None req-653ceaa5-04bb-4f69-926a-008272ea74fa None None] Creating RPC server for service compute {{(pid=59518) start /opt/stack/nova/nova/service.py:182}} [ 558.560953] env[59518]: DEBUG nova.service [None req-653ceaa5-04bb-4f69-926a-008272ea74fa None None] Join ServiceGroup membership for this service compute {{(pid=59518) start /opt/stack/nova/nova/service.py:199}} [ 558.561141] env[59518]: DEBUG nova.servicegroup.drivers.db [None req-653ceaa5-04bb-4f69-926a-008272ea74fa None None] DB_Driver: join new ServiceGroup member cpu-1 to the compute group, service = {{(pid=59518) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}} [ 567.919459] env[59518]: DEBUG dbcounter [-] [59518] Writing DB stats nova_cell1:SELECT=1 {{(pid=59518) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 567.920062] env[59518]: DEBUG dbcounter [-] [59518] Writing DB stats nova_cell0:SELECT=1 {{(pid=59518) stat_writer /usr/local/lib/python3.10/dist-packages/dbcounter.py:114}} [ 600.506216] env[59518]: DEBUG oslo_concurrency.lockutils [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Acquiring lock "bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70" by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 600.506506] env[59518]: DEBUG oslo_concurrency.lockutils [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Lock "bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 600.520934] env[59518]: DEBUG nova.compute.manager [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70] Starting instance... {{(pid=59518) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 600.614283] env[59518]: DEBUG oslo_concurrency.lockutils [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 600.614530] env[59518]: DEBUG oslo_concurrency.lockutils [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 600.616102] env[59518]: INFO nova.compute.claims [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 600.741204] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f09c7d8c-6027-4c82-9562-996f1d6acb79 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 600.749873] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f5f7b64-a544-423e-9780-bc88b04bbb85 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 600.780833] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c449c80a-72a9-4b85-9260-84fd34983877 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 600.788406] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1f45c6e7-9d31-4f4d-97e0-148ac62a42b3 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 600.801929] env[59518]: DEBUG nova.compute.provider_tree [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Inventory has not changed in ProviderTree for provider: ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd {{(pid=59518) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 600.810205] env[59518]: DEBUG nova.scheduler.client.report [None 
req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Inventory has not changed for provider ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 174, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59518) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 600.826450] env[59518]: DEBUG oslo_concurrency.lockutils [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.212s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 600.827049] env[59518]: DEBUG nova.compute.manager [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70] Start building networks asynchronously for instance. {{(pid=59518) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 600.873866] env[59518]: DEBUG nova.compute.utils [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Using /dev/sd instead of None {{(pid=59518) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 600.876162] env[59518]: DEBUG nova.compute.manager [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70] Allocating IP information in the background. {{(pid=59518) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 600.876803] env[59518]: DEBUG nova.network.neutron [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70] allocate_for_instance() {{(pid=59518) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 600.888134] env[59518]: DEBUG nova.compute.manager [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70] Start building block device mappings for instance. {{(pid=59518) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 600.967917] env[59518]: DEBUG nova.compute.manager [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70] Start spawning the instance on the hypervisor. 
{{(pid=59518) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 600.981646] env[59518]: DEBUG oslo_concurrency.lockutils [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Acquiring lock "21f44d88-a868-4765-95f8-8dbe8eccef7a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 600.981881] env[59518]: DEBUG oslo_concurrency.lockutils [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Lock "21f44d88-a868-4765-95f8-8dbe8eccef7a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 601.000651] env[59518]: DEBUG nova.compute.manager [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] Starting instance... {{(pid=59518) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 601.052653] env[59518]: DEBUG oslo_concurrency.lockutils [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 601.052887] env[59518]: DEBUG oslo_concurrency.lockutils [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 601.054600] env[59518]: INFO nova.compute.claims [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 601.151148] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0686acd6-1b8c-4109-b250-7f03d500702b {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 601.161942] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d577773-8748-4165-a2c9-a8f25e38b99c {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 601.201894] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ff24a180-6861-4523-9e80-79d21903fb2e {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 601.211629] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1d61ed57-e587-467c-97ec-20bb8461e3d7 {{(pid=59518) request_handler
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 601.227613] env[59518]: DEBUG nova.compute.provider_tree [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Inventory has not changed in ProviderTree for provider: ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd {{(pid=59518) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 601.240377] env[59518]: DEBUG nova.scheduler.client.report [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Inventory has not changed for provider ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 174, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59518) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 601.255044] env[59518]: DEBUG oslo_concurrency.lockutils [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.202s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 601.257611] env[59518]: DEBUG nova.compute.manager [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] Start building networks asynchronously for instance. {{(pid=59518) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 601.297574] env[59518]: DEBUG nova.compute.utils [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Using /dev/sd instead of None {{(pid=59518) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 601.303905] env[59518]: DEBUG nova.compute.manager [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] Allocating IP information in the background. {{(pid=59518) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 601.305162] env[59518]: DEBUG nova.network.neutron [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] allocate_for_instance() {{(pid=59518) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 601.311179] env[59518]: DEBUG nova.compute.manager [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] Start building block device mappings for instance. 
{{(pid=59518) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 601.383556] env[59518]: DEBUG nova.compute.manager [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] Start spawning the instance on the hypervisor. {{(pid=59518) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 601.488784] env[59518]: DEBUG nova.virt.hardware [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-05-28T13:15:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-05-28T13:15:21Z,direct_url=<?>,disk_format='vmdk',id=e70539a9-144d-4900-807e-914ae0cc8539,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='33bab75ff2cf45ecb4ab54af3adf83ee',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2024-05-28T13:15:22Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 601.489517] env[59518]: DEBUG nova.virt.hardware [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Flavor limits 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 601.489517] env[59518]: DEBUG nova.virt.hardware [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Image limits 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 601.489517] env[59518]: DEBUG nova.virt.hardware [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Flavor pref 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 601.489661] env[59518]: DEBUG nova.virt.hardware [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Image pref 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 601.489795] env[59518]: DEBUG nova.virt.hardware [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 601.490041] env[59518]: DEBUG nova.virt.hardware [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 601.490232] env[59518]: DEBUG nova.virt.hardware [None req-dff33d41-ce17-4764-86a6-7e27db52e435
tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59518) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 601.490843] env[59518]: DEBUG nova.virt.hardware [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Got 1 possible topologies {{(pid=59518) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 601.491092] env[59518]: DEBUG nova.virt.hardware [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 601.491308] env[59518]: DEBUG nova.virt.hardware [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 601.492289] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a1514532-55aa-40fb-aaf6-7ae35ae85d57 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 601.500588] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c7907fbf-8d1d-444f-a6e4-440402ec1c67 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 601.517565] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-836ce1b0-bd39-4298-9ad4-1857d8c1c42b {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 601.558273] env[59518]: DEBUG nova.policy [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7d8cc34e55ad4fc7a7238e1058056efa', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e48ea78a3aa548d89738c45e00618d64', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59518) authorize /opt/stack/nova/nova/policy.py:203}} [ 601.758541] env[59518]: DEBUG nova.policy [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '34d1251a0db64dc7a4a20085390672a3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7b766a9205774740bbff73e46bd3b905', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59518) authorize /opt/stack/nova/nova/policy.py:203}} [ 601.839953] env[59518]: DEBUG nova.virt.hardware [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 
tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-05-28T13:15:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-05-28T13:15:21Z,direct_url=<?>,disk_format='vmdk',id=e70539a9-144d-4900-807e-914ae0cc8539,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='33bab75ff2cf45ecb4ab54af3adf83ee',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2024-05-28T13:15:22Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 601.840202] env[59518]: DEBUG nova.virt.hardware [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Flavor limits 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 601.840702] env[59518]: DEBUG nova.virt.hardware [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Image limits 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 601.840702] env[59518]: DEBUG nova.virt.hardware [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Flavor pref 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 601.840702] env[59518]: DEBUG nova.virt.hardware [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Image pref 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 601.840845] env[59518]: DEBUG nova.virt.hardware [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 601.840988] env[59518]: DEBUG nova.virt.hardware [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 601.841156] env[59518]: DEBUG nova.virt.hardware [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59518) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 601.841299] env[59518]: DEBUG nova.virt.hardware [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305
tempest-DeleteServersAdminTestJSON-1007061305-project-member] Got 1 possible topologies {{(pid=59518) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 601.841446] env[59518]: DEBUG nova.virt.hardware [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 601.841947] env[59518]: DEBUG nova.virt.hardware [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 601.842589] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5b23baa6-0359-416b-9854-d993f8aaa3b9 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 601.851108] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7789da5b-9a22-467a-9a41-c45c579ba5f6 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 602.462958] env[59518]: DEBUG nova.network.neutron [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70] Successfully created port: b13fae5b-bd06-45f6-8618-191c68992381 {{(pid=59518) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 602.556761] env[59518]: DEBUG nova.network.neutron [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] Successfully created port: 0bbc769a-e7ad-418c-93dc-6a5cd658e622 {{(pid=59518) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 603.366070] env[59518]: DEBUG oslo_concurrency.lockutils [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Acquiring lock "b037b116-3b8c-4f10-990c-a855f96fa61c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 603.366428] env[59518]: DEBUG oslo_concurrency.lockutils [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Lock "b037b116-3b8c-4f10-990c-a855f96fa61c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 603.385946] env[59518]: DEBUG nova.compute.manager [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] Starting instance...
{{(pid=59518) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 603.459476] env[59518]: DEBUG oslo_concurrency.lockutils [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 603.459476] env[59518]: DEBUG oslo_concurrency.lockutils [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 603.459476] env[59518]: INFO nova.compute.claims [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 603.592021] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-97da3b3b-291f-4bc7-96b5-35384c8bbedc {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 603.600189] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8802aa2c-ba2f-4544-b150-d0d2ff11f264 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 603.636232] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-020c3abe-50b6-434a-a9cb-3714ee5e10c5 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 603.644614] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d7f8e2e3-9199-459f-92bf-e285509fb31c {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 603.662069] env[59518]: DEBUG nova.compute.provider_tree [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Inventory has not changed in ProviderTree for provider: ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd {{(pid=59518) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 603.671364] env[59518]: DEBUG nova.scheduler.client.report [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Inventory has not changed for provider ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 174, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59518) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 603.690602] env[59518]: DEBUG oslo_concurrency.lockutils [None req-6000652d-26be-4c7e-9252-efcee19cd456 
tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.235s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 603.691354] env[59518]: DEBUG nova.compute.manager [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] Start building networks asynchronously for instance. {{(pid=59518) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 603.738568] env[59518]: DEBUG nova.compute.utils [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Using /dev/sd instead of None {{(pid=59518) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 603.739602] env[59518]: DEBUG nova.compute.manager [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] Not allocating networking since 'none' was specified. {{(pid=59518) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1948}} [ 603.755910] env[59518]: DEBUG nova.compute.manager [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] Start building block device mappings for instance. {{(pid=59518) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 603.854750] env[59518]: DEBUG nova.compute.manager [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] Start spawning the instance on the hypervisor. 
{{(pid=59518) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 603.879966] env[59518]: DEBUG nova.virt.hardware [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-05-28T13:15:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-05-28T13:15:21Z,direct_url=<?>,disk_format='vmdk',id=e70539a9-144d-4900-807e-914ae0cc8539,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='33bab75ff2cf45ecb4ab54af3adf83ee',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2024-05-28T13:15:22Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 603.880246] env[59518]: DEBUG nova.virt.hardware [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Flavor limits 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 603.880362] env[59518]: DEBUG nova.virt.hardware [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Image limits 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 603.880540] env[59518]: DEBUG nova.virt.hardware [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Flavor pref 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 603.880676] env[59518]: DEBUG nova.virt.hardware [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Image pref 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 603.880814] env[59518]: DEBUG nova.virt.hardware [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 603.881050] env[59518]: DEBUG nova.virt.hardware [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 603.881200] env[59518]: DEBUG nova.virt.hardware [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59518) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 603.881355]
env[59518]: DEBUG nova.virt.hardware [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Got 1 possible topologies {{(pid=59518) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 603.881503] env[59518]: DEBUG nova.virt.hardware [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 603.881656] env[59518]: DEBUG nova.virt.hardware [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 603.882524] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f49df9bc-20c1-4a2e-a5a7-1abf15a184d6 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 603.891091] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3d7553f-3aaf-4a5f-b7a7-993d6e7939a7 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 603.906303] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] Instance VIF info [] {{(pid=59518) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 603.915687] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Creating folder: OpenStack. Parent ref: group-v4. {{(pid=59518) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 603.916033] env[59518]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1ab3a0b7-42ae-4ad1-a87c-f70b6e5b517b {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 603.928334] env[59518]: INFO nova.virt.vmwareapi.vm_util [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Created folder: OpenStack in parent group-v4. [ 603.928508] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Creating folder: Project (68c97409e4b5436da7b7018d8b91bf38). Parent ref: group-v88807. 
{{(pid=59518) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 603.929244] env[59518]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d4e01663-bf49-4299-8e58-7949d7c20b51 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 603.940291] env[59518]: INFO nova.virt.vmwareapi.vm_util [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Created folder: Project (68c97409e4b5436da7b7018d8b91bf38) in parent group-v88807. [ 603.940291] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Creating folder: Instances. Parent ref: group-v88808. {{(pid=59518) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 603.942301] env[59518]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a3a9b879-dff4-4fdd-9d2e-f251e3ef6e94 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 603.954610] env[59518]: INFO nova.virt.vmwareapi.vm_util [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Created folder: Instances in parent group-v88808. [ 603.954951] env[59518]: DEBUG oslo.service.loopingcall [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59518) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 603.955168] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] Creating VM on the ESX host {{(pid=59518) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 603.955366] env[59518]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-e726d29a-ca2d-48e4-abcc-aa3e4e1952bb {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 603.974072] env[59518]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 603.974072] env[59518]: value = "task-307905" [ 603.974072] env[59518]: _type = "Task" [ 603.974072] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 603.988390] env[59518]: DEBUG oslo_vmware.api [-] Task: {'id': task-307905, 'name': CreateVM_Task} progress is 0%. 
{{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 604.387430] env[59518]: DEBUG oslo_concurrency.lockutils [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Acquiring lock "6ffd468f-b92f-45ae-834f-6daac20937ef" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 604.387675] env[59518]: DEBUG oslo_concurrency.lockutils [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Lock "6ffd468f-b92f-45ae-834f-6daac20937ef" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 604.401601] env[59518]: DEBUG nova.compute.manager [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] Starting instance... {{(pid=59518) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 604.468041] env[59518]: DEBUG oslo_concurrency.lockutils [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 604.468041] env[59518]: DEBUG oslo_concurrency.lockutils [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 604.468041] env[59518]: INFO nova.compute.claims [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 604.483648] env[59518]: DEBUG oslo_vmware.api [-] Task: {'id': task-307905, 'name': CreateVM_Task, 'duration_secs': 0.271031} completed successfully.
{{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 604.483797] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] Created VM on the ESX host {{(pid=59518) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 604.485117] env[59518]: DEBUG oslo_vmware.service [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dda85dcf-364c-4d34-b0cf-632cebf8912c {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 604.492008] env[59518]: DEBUG oslo_concurrency.lockutils [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 604.492177] env[59518]: DEBUG oslo_concurrency.lockutils [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 604.492791] env[59518]: DEBUG oslo_concurrency.lockutils [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 604.493033] env[59518]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6bc068f1-3d2d-4baa-b361-fd8b6e53aee9 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 604.497556] env[59518]: DEBUG oslo_vmware.api [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Waiting for the task: (returnval){ [ 604.497556] env[59518]: value = "session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]522b7fd9-28fb-d6cd-f001-a8e1bdc20fbf" [ 604.497556] env[59518]: _type = "Task" [ 604.497556] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 604.505120] env[59518]: DEBUG oslo_vmware.api [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Task: {'id': session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]522b7fd9-28fb-d6cd-f001-a8e1bdc20fbf, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 604.616078] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c430bfd2-1846-476e-9c74-640e38f41144 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 604.623387] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c5232cc5-3077-4ca5-8cf3-a4e7002a7c50 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 604.661468] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b4d1e12b-882a-4204-b46b-72888bb01d48 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 604.670348] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16efc340-c0dc-4b35-9c78-149103162c02 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 604.685766] env[59518]: DEBUG nova.compute.provider_tree [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Inventory has not changed in ProviderTree for provider: ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd {{(pid=59518) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 604.694139] env[59518]: DEBUG nova.scheduler.client.report [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Inventory has not changed for provider ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 174, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59518) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 604.711781] env[59518]: DEBUG oslo_concurrency.lockutils [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.247s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 604.712405] env[59518]: DEBUG nova.compute.manager [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] Start building networks asynchronously for instance. 
{{(pid=59518) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 604.748787] env[59518]: DEBUG nova.compute.utils [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Using /dev/sd instead of None {{(pid=59518) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 604.749142] env[59518]: DEBUG nova.compute.manager [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] Allocating IP information in the background. {{(pid=59518) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 604.749745] env[59518]: DEBUG nova.network.neutron [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] allocate_for_instance() {{(pid=59518) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 604.759438] env[59518]: DEBUG nova.compute.manager [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] Start building block device mappings for instance. {{(pid=59518) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 604.818861] env[59518]: DEBUG nova.policy [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e474a038cbe74633867e6a7d0718927f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd18e261ab4ee4cc4a05a3c624639dbce', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59518) authorize /opt/stack/nova/nova/policy.py:203}} [ 604.868547] env[59518]: DEBUG nova.compute.manager [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] Start spawning the instance on the hypervisor. 
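The "Policy check for network:attach_external_network failed" entry is expected rather than an error: that rule conventionally requires the admin role, the tempest credentials above carry only reader and member, and on failure the build simply proceeds without permission to attach external networks. A toy role check showing the shape of that decision; the rule table is a stand-in for oslo.policy's configured rules, not its API:

    # Hypothetical rule table; real rules come from policy defaults/files.
    RULES = {"network:attach_external_network": {"admin"}}

    def authorize(action, credentials):
        required = RULES.get(action, set())
        return bool(required & set(credentials["roles"]))

    creds = {"roles": ["reader", "member"], "is_admin": False}
    if not authorize("network:attach_external_network", creds):
        # The boot continues; the instance just cannot request an
        # external network, matching the DEBUG line above.
        print("Policy check for network:attach_external_network failed")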
{{(pid=59518) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 604.890004] env[59518]: DEBUG nova.virt.hardware [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-05-28T13:15:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-05-28T13:15:21Z,direct_url=,disk_format='vmdk',id=e70539a9-144d-4900-807e-914ae0cc8539,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='33bab75ff2cf45ecb4ab54af3adf83ee',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-05-28T13:15:22Z,virtual_size=,visibility=), allow threads: False {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 604.890229] env[59518]: DEBUG nova.virt.hardware [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Flavor limits 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 604.890382] env[59518]: DEBUG nova.virt.hardware [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Image limits 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 604.890538] env[59518]: DEBUG nova.virt.hardware [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Flavor pref 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 604.890671] env[59518]: DEBUG nova.virt.hardware [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Image pref 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 604.890806] env[59518]: DEBUG nova.virt.hardware [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 604.891057] env[59518]: DEBUG nova.virt.hardware [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 604.891210] env[59518]: DEBUG nova.virt.hardware [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59518) 
_get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 604.891366] env[59518]: DEBUG nova.virt.hardware [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Got 1 possible topologies {{(pid=59518) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 604.891517] env[59518]: DEBUG nova.virt.hardware [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 604.891675] env[59518]: DEBUG nova.virt.hardware [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 604.892580] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dd0015a3-0d54-42e3-a4da-bf780bf90df5 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 604.900955] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b4916b0d-b5d5-4e62-9b76-65198a45c22c {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 605.009304] env[59518]: DEBUG oslo_concurrency.lockutils [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 605.009468] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] Processing image e70539a9-144d-4900-807e-914ae0cc8539 {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 605.009668] env[59518]: DEBUG oslo_concurrency.lockutils [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 605.009801] env[59518]: DEBUG oslo_concurrency.lockutils [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 605.010224] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 
tempest-ServerDiagnosticsV248Test-109633817-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 605.010463] env[59518]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e20810a3-d73d-4210-abde-5c7006efccb3 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 605.018838] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 605.019049] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=59518) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 605.021460] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6cf9a9f-d45d-4cbb-8193-b6aa9a041bdf {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 605.026423] env[59518]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-31b1f7bd-a1e0-4324-862b-18f6078bc62a {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 605.031933] env[59518]: DEBUG oslo_vmware.api [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Waiting for the task: (returnval){ [ 605.031933] env[59518]: value = "session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]52386a36-d874-50a7-7945-849a846d34d8" [ 605.031933] env[59518]: _type = "Task" [ 605.031933] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 605.059447] env[59518]: DEBUG oslo_vmware.api [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Task: {'id': session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]52386a36-d874-50a7-7945-849a846d34d8, 'name': SearchDatastore_Task} progress is 0%. 
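The MakeDirectory call followed by "Folder [datastore1] devstack-image-cache_base created." shows the create-if-missing convention used for the image cache: the driver always attempts the mkdir and treats an already-exists fault as success, so concurrent builds can race on the same cache directory harmlessly. A local-filesystem sketch of the same idempotent step, with a /tmp path standing in for the datastore path:

    from pathlib import Path

    def create_folder_if_missing(path: Path) -> None:
        # Equivalent in spirit to FileManager.MakeDirectory with parent
        # creation: succeed whether or not the directory already exists.
        path.mkdir(parents=True, exist_ok=True)
        print(f"Folder {path} created (or already present).")

    create_folder_if_missing(Path("/tmp/devstack-image-cache_base"))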
{{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 605.082258] env[59518]: DEBUG oslo_concurrency.lockutils [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Acquiring lock "37d863b9-bfcb-4d1f-b99b-832276bd640f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 605.082473] env[59518]: DEBUG oslo_concurrency.lockutils [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Lock "37d863b9-bfcb-4d1f-b99b-832276bd640f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 605.096199] env[59518]: DEBUG nova.compute.manager [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] Starting instance... {{(pid=59518) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 605.156246] env[59518]: DEBUG oslo_concurrency.lockutils [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 605.156496] env[59518]: DEBUG oslo_concurrency.lockutils [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 605.157933] env[59518]: INFO nova.compute.claims [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 605.313151] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9c6547e4-8ccc-41ce-ab24-ff59e7cdfe02 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 605.321489] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c742a5c-ea83-4e70-ae23-bcb224976901 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 605.354999] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6fc4725-8666-495d-80e1-0186eeca960b {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 605.362728] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4f8567b-488e-45ad-a6f5-5e0db40df701 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 605.383554] env[59518]: DEBUG
nova.compute.provider_tree [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Inventory has not changed in ProviderTree for provider: ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd {{(pid=59518) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 605.393294] env[59518]: DEBUG nova.scheduler.client.report [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Inventory has not changed for provider ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 174, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59518) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 605.407053] env[59518]: DEBUG oslo_concurrency.lockutils [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.250s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 605.407538] env[59518]: DEBUG nova.compute.manager [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] Start building networks asynchronously for instance. {{(pid=59518) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 605.443850] env[59518]: DEBUG nova.compute.utils [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Using /dev/sd instead of None {{(pid=59518) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 605.445117] env[59518]: DEBUG nova.compute.manager [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] Allocating IP information in the background. {{(pid=59518) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 605.445278] env[59518]: DEBUG nova.network.neutron [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] allocate_for_instance() {{(pid=59518) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 605.454524] env[59518]: DEBUG nova.compute.manager [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] Start building block device mappings for instance. 
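Each build above claims resources inside the "compute_resources" lock (acquired, "Claim successful", released about 0.25s later): the resource tracker serializes claims so two concurrent builds cannot both consume the last free capacity. A threading sketch of that pattern; the class and the single vCPU counter are simplifications, not Nova's tracker:

    import threading

    class ResourceTracker:
        def __init__(self, vcpus_free):
            self._lock = threading.Lock()  # plays the "compute_resources" lock
            self.vcpus_free = vcpus_free

        def instance_claim(self, instance_uuid, vcpus):
            # All claims funnel through one lock, matching the
            # acquired/released pairs in the log.
            with self._lock:
                if vcpus > self.vcpus_free:
                    raise RuntimeError("insufficient resources")
                self.vcpus_free -= vcpus
                print(f"Claim successful for instance {instance_uuid}")

    rt = ResourceTracker(vcpus_free=48)
    rt.instance_claim("37d863b9-bfcb-4d1f-b99b-832276bd640f", vcpus=1)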
{{(pid=59518) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 605.505859] env[59518]: DEBUG nova.network.neutron [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] Successfully updated port: 0bbc769a-e7ad-418c-93dc-6a5cd658e622 {{(pid=59518) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 605.517020] env[59518]: DEBUG oslo_concurrency.lockutils [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Acquiring lock "refresh_cache-21f44d88-a868-4765-95f8-8dbe8eccef7a" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 605.517155] env[59518]: DEBUG oslo_concurrency.lockutils [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Acquired lock "refresh_cache-21f44d88-a868-4765-95f8-8dbe8eccef7a" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 605.517298] env[59518]: DEBUG nova.network.neutron [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] Building network info cache for instance {{(pid=59518) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 605.541705] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] Preparing fetch location {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 605.541960] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Creating directory with path [datastore1] vmware_temp/32908a3e-d6cd-4028-8baf-7ef1b8ddd868/e70539a9-144d-4900-807e-914ae0cc8539 {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 605.542190] env[59518]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-08a99bd7-eb9a-4cd8-bf2f-df1fb79c685d {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 605.549181] env[59518]: DEBUG nova.compute.manager [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] Start spawning the instance on the hypervisor. 
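The image-cache sequence above (take the per-image lock, SearchDatastore for the cached VMDK, and on a miss "Preparing fetch location" in a vmware_temp directory) is a fetch-if-missing guarded by a lock keyed on the image ID, so only one build downloads a given image while the others wait and then hit the cache. A local sketch of that control flow with filesystem stand-ins for the datastore operations:

    import threading
    from pathlib import Path

    _image_locks = {}
    _locks_guard = threading.Lock()

    def _lock_for(image_id):
        with _locks_guard:
            return _image_locks.setdefault(image_id, threading.Lock())

    def fetch_image_if_missing(cache_dir, image_id, download):
        cached = cache_dir / image_id / f"{image_id}.vmdk"
        with _lock_for(image_id):          # one fetch per image at a time
            if cached.exists():            # the SearchDatastore_Task hit case
                return cached
            tmp = cache_dir / "vmware_temp" / image_id  # "Preparing fetch location"
            tmp.mkdir(parents=True, exist_ok=True)
            download(tmp / "tmp-sparse.vmdk")
            cached.parent.mkdir(parents=True, exist_ok=True)
            (tmp / "tmp-sparse.vmdk").rename(cached)    # publish into the cache
            return cached

    fetch_image_if_missing(Path("/tmp/devstack-image-cache_base"),
                           "e70539a9-144d-4900-807e-914ae0cc8539",
                           lambda p: p.write_bytes(b"vmdk bytes"))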
{{(pid=59518) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 605.569977] env[59518]: DEBUG nova.network.neutron [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70] Successfully updated port: b13fae5b-bd06-45f6-8618-191c68992381 {{(pid=59518) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 605.577780] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Created directory with path [datastore1] vmware_temp/32908a3e-d6cd-4028-8baf-7ef1b8ddd868/e70539a9-144d-4900-807e-914ae0cc8539 {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 605.578009] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] Fetch image to [datastore1] vmware_temp/32908a3e-d6cd-4028-8baf-7ef1b8ddd868/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 605.578133] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] Downloading image file data e70539a9-144d-4900-807e-914ae0cc8539 to [datastore1] vmware_temp/32908a3e-d6cd-4028-8baf-7ef1b8ddd868/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk on the data store datastore1 {{(pid=59518) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 605.579131] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-357e3ca5-efd8-4211-8f96-0d509e25e9bc {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 605.583880] env[59518]: DEBUG oslo_concurrency.lockutils [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Acquiring lock "refresh_cache-bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 605.583998] env[59518]: DEBUG oslo_concurrency.lockutils [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Acquired lock "refresh_cache-bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 605.584180] env[59518]: DEBUG nova.network.neutron [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70] Building network info cache for instance {{(pid=59518) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 605.586993] env[59518]: DEBUG nova.virt.hardware [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Getting desirable topologies for flavor 
Flavor(created_at=2024-05-28T13:15:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-05-28T13:15:21Z,direct_url=,disk_format='vmdk',id=e70539a9-144d-4900-807e-914ae0cc8539,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='33bab75ff2cf45ecb4ab54af3adf83ee',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-05-28T13:15:22Z,virtual_size=,visibility=), allow threads: False {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 605.587189] env[59518]: DEBUG nova.virt.hardware [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Flavor limits 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 605.587329] env[59518]: DEBUG nova.virt.hardware [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Image limits 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 605.587494] env[59518]: DEBUG nova.virt.hardware [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Flavor pref 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 605.587625] env[59518]: DEBUG nova.virt.hardware [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Image pref 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 605.587757] env[59518]: DEBUG nova.virt.hardware [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 605.587959] env[59518]: DEBUG nova.virt.hardware [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 605.588118] env[59518]: DEBUG nova.virt.hardware [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59518) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 605.588275] env[59518]: DEBUG nova.virt.hardware [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Got 1 possible topologies {{(pid=59518) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 605.588426] env[59518]: DEBUG nova.virt.hardware [None req-bbcf335a-9062-484d-b9c6-a50e441b895b 
tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 605.588585] env[59518]: DEBUG nova.virt.hardware [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 605.592993] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a9acb764-df10-4c77-9a64-0363b99f85e5 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 605.606325] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-18567f99-ff58-4d17-bc03-89d529885fad {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 605.623057] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-311a5956-bd5e-4839-a7a1-7b65b92f0c29 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 605.635646] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bbdefb29-60ef-43ba-947a-0ba87c5174e5 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 605.683408] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d6bb10dc-5c64-4478-a8f6-cbbf63d3d3af {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 605.690785] env[59518]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-d9b481c0-6ce8-40f1-aa5c-85fc00997282 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 605.694657] env[59518]: DEBUG nova.policy [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5b893dfae76248ec98ab38c6abb6047c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd3476ab8778c40218c4b2b54e1297f19', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59518) authorize /opt/stack/nova/nova/policy.py:203}} [ 605.696436] env[59518]: DEBUG nova.network.neutron [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] Instance cache missing network info. 
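The topology lines above show the full computation for a 1-vCPU flavor: limits default to 65536 sockets/cores/threads, the preference is unset (0:0:0), and enumerating the factorizations of 1 yields exactly one candidate, VirtCPUTopology(cores=1,sockets=1,threads=1). A compact reimplementation of that enumeration step; this is an independent sketch of the algorithm, not Nova's code:

    from itertools import product

    def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                            max_threads=65536):
        # Every (sockets, cores, threads) triple whose product is vcpus
        # and which fits within the limits.
        divisors = [d for d in range(1, vcpus + 1) if vcpus % d == 0]
        for sockets, cores in product(divisors, divisors):
            if vcpus % (sockets * cores):
                continue
            threads = vcpus // (sockets * cores)
            if (sockets <= max_sockets and cores <= max_cores
                    and threads <= max_threads):
                yield (sockets, cores, threads)

    print(list(possible_topologies(1)))   # [(1, 1, 1)], matching the log
    print(list(possible_topologies(4)))   # several candidates for 4 vCPUs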
{{(pid=59518) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 605.734397] env[59518]: DEBUG nova.network.neutron [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70] Instance cache missing network info. {{(pid=59518) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 605.787962] env[59518]: DEBUG nova.virt.vmwareapi.images [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] Downloading image file data e70539a9-144d-4900-807e-914ae0cc8539 to the data store datastore1 {{(pid=59518) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 605.860614] env[59518]: DEBUG oslo_vmware.rw_handles [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/32908a3e-d6cd-4028-8baf-7ef1b8ddd868/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59518) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 605.931876] env[59518]: DEBUG oslo_vmware.rw_handles [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Completed reading data from the image iterator. {{(pid=59518) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 605.932046] env[59518]: DEBUG oslo_vmware.rw_handles [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/32908a3e-d6cd-4028-8baf-7ef1b8ddd868/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59518) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 606.079174] env[59518]: DEBUG oslo_concurrency.lockutils [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Acquiring lock "a894a8af-52b8-4b1c-a5ea-2469f06ea17a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 606.079566] env[59518]: DEBUG oslo_concurrency.lockutils [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Lock "a894a8af-52b8-4b1c-a5ea-2469f06ea17a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 606.094914] env[59518]: DEBUG nova.compute.manager [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] Starting instance...
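The rw_handles entries show the image bytes being streamed straight into the ESX host's /folder HTTP endpoint: one authenticated request whose Content-Length equals the image size (21318656), bracketed by "Creating HTTP connection" and "Closing write handle". A bare-bones standard-library sketch of such a streaming upload; the host, path, cookie, and the choice of PUT are placeholders for what the real handle negotiates with a vCenter service ticket:

    import http.client

    def upload_to_datastore(host, ds_path, data_iter, size, cookie):
        conn = http.client.HTTPSConnection(host, 443)
        conn.putrequest(
            "PUT", f"/folder/{ds_path}?dcPath=ha-datacenter&dsName=datastore1")
        conn.putheader("Content-Length", str(size))
        conn.putheader("Cookie", cookie)   # session ticket in the real flow
        conn.endheaders()
        for chunk in data_iter:            # "Completed reading data from the
            conn.send(chunk)               #  image iterator" once this drains
        status = conn.getresponse().status
        conn.close()                       # "Closing write handle"
        return status

    # Illustrative call shape only; not executed against a real host:
    # upload_to_datastore("esx.example.org", "vmware_temp/.../tmp-sparse.vmdk",
    #                     iter([b"..."]), 21318656, cookie="vmware_soap_session=...")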
{{(pid=59518) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 606.163426] env[59518]: DEBUG oslo_concurrency.lockutils [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 606.163667] env[59518]: DEBUG oslo_concurrency.lockutils [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 606.165188] env[59518]: INFO nova.compute.claims [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 606.398242] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7b37511b-bbb5-4965-9494-6f6727a8ad47 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.409377] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6afb274d-537a-4f2d-aad2-23e86f78c2ec {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.451387] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5deac76a-6b2f-4a18-9a31-85945a4c82bd {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.469915] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ed3a5cf1-2faf-4293-97ae-aa4cc2735d33 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.490492] env[59518]: DEBUG nova.compute.provider_tree [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Inventory has not changed in ProviderTree for provider: ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd {{(pid=59518) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 606.500982] env[59518]: DEBUG nova.scheduler.client.report [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Inventory has not changed for provider ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 174, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59518) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 606.517222] env[59518]: DEBUG oslo_concurrency.lockutils [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e 
tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.353s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 606.517738] env[59518]: DEBUG nova.compute.manager [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] Start building networks asynchronously for instance. {{(pid=59518) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 606.571055] env[59518]: DEBUG nova.compute.utils [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Using /dev/sd instead of None {{(pid=59518) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 606.572529] env[59518]: DEBUG nova.compute.manager [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] Allocating IP information in the background. {{(pid=59518) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 606.572690] env[59518]: DEBUG nova.network.neutron [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] allocate_for_instance() {{(pid=59518) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 606.587826] env[59518]: DEBUG nova.compute.manager [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] Start building block device mappings for instance. 
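"Allocating IP information in the background." means port allocation runs concurrently with the rest of the build: block-device mapping and spawn preparation continue while Neutron works, and the spawn later blocks on the network result. Nova does this with an eventlet greenthread; a plain-threads sketch of the same overlap, with the two worker functions as stand-ins:

    from concurrent.futures import ThreadPoolExecutor

    def allocate_for_instance(instance_uuid):
        # Stand-in for the Neutron calls that create and bind the port.
        return [{"id": "port-uuid", "address": "fa:16:3e:00:00:01"}]

    def build_block_device_mappings(instance_uuid):
        return ["/dev/sda"]

    uuid = "6ffd468f-b92f-45ae-834f-6daac20937ef"
    with ThreadPoolExecutor(max_workers=1) as pool:
        nw_future = pool.submit(allocate_for_instance, uuid)  # background
        bdms = build_block_device_mappings(uuid)              # foreground
        network_info = nw_future.result()  # spawn blocks here if Neutron is slow
    print(bdms, network_info)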
{{(pid=59518) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 606.649724] env[59518]: DEBUG nova.network.neutron [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] Updating instance_info_cache with network_info: [{"id": "0bbc769a-e7ad-418c-93dc-6a5cd658e622", "address": "fa:16:3e:31:b5:ee", "network": {"id": "d9098ce0-cf4b-4e0f-b9b7-b613f8d5b1d5", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.82", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "33bab75ff2cf45ecb4ab54af3adf83ee", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "eaf3dfa2-fa01-4d4d-8ecd-a9bc74d90ec2", "external-id": "nsx-vlan-transportzone-546", "segmentation_id": 546, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0bbc769a-e7", "ovs_interfaceid": "0bbc769a-e7ad-418c-93dc-6a5cd658e622", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 606.665485] env[59518]: DEBUG oslo_concurrency.lockutils [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Releasing lock "refresh_cache-21f44d88-a868-4765-95f8-8dbe8eccef7a" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 606.665787] env[59518]: DEBUG nova.compute.manager [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] Instance network_info: |[{"id": "0bbc769a-e7ad-418c-93dc-6a5cd658e622", "address": "fa:16:3e:31:b5:ee", "network": {"id": "d9098ce0-cf4b-4e0f-b9b7-b613f8d5b1d5", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.82", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "33bab75ff2cf45ecb4ab54af3adf83ee", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "eaf3dfa2-fa01-4d4d-8ecd-a9bc74d90ec2", "external-id": "nsx-vlan-transportzone-546", "segmentation_id": 546, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0bbc769a-e7", "ovs_interfaceid": "0bbc769a-e7ad-418c-93dc-6a5cd658e622", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59518) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 606.666231] env[59518]: DEBUG 
nova.virt.vmwareapi.vmops [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:31:b5:ee', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'eaf3dfa2-fa01-4d4d-8ecd-a9bc74d90ec2', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '0bbc769a-e7ad-418c-93dc-6a5cd658e622', 'vif_model': 'vmxnet3'}] {{(pid=59518) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 606.675732] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Creating folder: Project (7b766a9205774740bbff73e46bd3b905). Parent ref: group-v88807. {{(pid=59518) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 606.676400] env[59518]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-56c77874-ade7-40b8-aeb2-d8907d7f1244 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.689397] env[59518]: DEBUG nova.network.neutron [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] Successfully created port: 8a7ecefa-7677-47e3-aa52-02d74d7b8a40 {{(pid=59518) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 606.695640] env[59518]: INFO nova.virt.vmwareapi.vm_util [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Created folder: Project (7b766a9205774740bbff73e46bd3b905) in parent group-v88807. [ 606.695887] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Creating folder: Instances. Parent ref: group-v88811. {{(pid=59518) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 606.696906] env[59518]: DEBUG nova.compute.manager [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] Start spawning the instance on the hypervisor. {{(pid=59518) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 606.699258] env[59518]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-deb0dd29-42c5-4d32-8cae-e336f3b417a7 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.710894] env[59518]: INFO nova.virt.vmwareapi.vm_util [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Created folder: Instances in parent group-v88811. [ 606.711149] env[59518]: DEBUG oslo.service.loopingcall [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
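The "Instance VIF info [...]" entry above is the translation step from a Neutron port in network_info to the backing the VMware driver needs: the port's MAC, the vmxnet3 model, and an OpaqueNetwork reference built from the port's nsx-logical-switch-id. A small translation function over the structure shown above; the field names are copied from the log, the function itself is a sketch:

    def neutron_vif_to_vmware(vif):
        details = vif["details"]
        return {
            "network_name": vif["network"]["bridge"],   # 'br-int' above
            "mac_address": vif["address"],
            "network_ref": {
                "type": "OpaqueNetwork",
                "network-id": details["nsx-logical-switch-id"],
                "network-type": "nsx.LogicalSwitch",
                "use-external-id": True,
            },
            "iface_id": vif["id"],
            "vif_model": "vmxnet3",
        }

    vif = {"id": "0bbc769a-e7ad-418c-93dc-6a5cd658e622",
           "address": "fa:16:3e:31:b5:ee",
           "network": {"bridge": "br-int"},
           "details": {"nsx-logical-switch-id":
                       "eaf3dfa2-fa01-4d4d-8ecd-a9bc74d90ec2"}}
    print(neutron_vif_to_vmware(vif))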
{{(pid=59518) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 606.711323] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] Creating VM on the ESX host {{(pid=59518) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 606.711515] env[59518]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-d211b5ec-e357-4c5c-a656-51c9c7ee47dd {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.732300] env[59518]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 606.732300] env[59518]: value = "task-307908" [ 606.732300] env[59518]: _type = "Task" [ 606.732300] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 606.734636] env[59518]: DEBUG nova.virt.hardware [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-05-28T13:15:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-05-28T13:15:21Z,direct_url=,disk_format='vmdk',id=e70539a9-144d-4900-807e-914ae0cc8539,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='33bab75ff2cf45ecb4ab54af3adf83ee',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-05-28T13:15:22Z,virtual_size=,visibility=), allow threads: False {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 606.734841] env[59518]: DEBUG nova.virt.hardware [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Flavor limits 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 606.734985] env[59518]: DEBUG nova.virt.hardware [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Image limits 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 606.735154] env[59518]: DEBUG nova.virt.hardware [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Flavor pref 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 606.735288] env[59518]: DEBUG nova.virt.hardware [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Image pref 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 606.735424] env[59518]: DEBUG nova.virt.hardware [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59518) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:430}} [ 606.736375] env[59518]: DEBUG nova.virt.hardware [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 606.736572] env[59518]: DEBUG nova.virt.hardware [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59518) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 606.736744] env[59518]: DEBUG nova.virt.hardware [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Got 1 possible topologies {{(pid=59518) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 606.736908] env[59518]: DEBUG nova.virt.hardware [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 606.737079] env[59518]: DEBUG nova.virt.hardware [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 606.737911] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-04f14a12-8df4-4e31-9aa1-28758f8052b4 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.751642] env[59518]: DEBUG oslo_vmware.api [-] Task: {'id': task-307908, 'name': CreateVM_Task} progress is 6%. 
{{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 606.752961] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4d215459-b4c8-42a8-a338-c198133ebdf5 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.842858] env[59518]: DEBUG nova.network.neutron [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70] Updating instance_info_cache with network_info: [{"id": "b13fae5b-bd06-45f6-8618-191c68992381", "address": "fa:16:3e:56:61:a2", "network": {"id": "f82b2678-6625-457a-bc6c-6a4d9004014e", "bridge": "br-int", "label": "tempest-ImagesTestJSON-812614802-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e48ea78a3aa548d89738c45e00618d64", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e23c1d18-c841-49ea-95f3-df5ceac28afd", "external-id": "nsx-vlan-transportzone-774", "segmentation_id": 774, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb13fae5b-bd", "ovs_interfaceid": "b13fae5b-bd06-45f6-8618-191c68992381", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 606.856088] env[59518]: DEBUG oslo_concurrency.lockutils [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Releasing lock "refresh_cache-bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 606.856218] env[59518]: DEBUG nova.compute.manager [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70] Instance network_info: |[{"id": "b13fae5b-bd06-45f6-8618-191c68992381", "address": "fa:16:3e:56:61:a2", "network": {"id": "f82b2678-6625-457a-bc6c-6a4d9004014e", "bridge": "br-int", "label": "tempest-ImagesTestJSON-812614802-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e48ea78a3aa548d89738c45e00618d64", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e23c1d18-c841-49ea-95f3-df5ceac28afd", "external-id": "nsx-vlan-transportzone-774", "segmentation_id": 774, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb13fae5b-bd", "ovs_interfaceid": "b13fae5b-bd06-45f6-8618-191c68992381", "qbh_params": null, "qbg_params": 
null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59518) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 606.856573] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:56:61:a2', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'e23c1d18-c841-49ea-95f3-df5ceac28afd', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'b13fae5b-bd06-45f6-8618-191c68992381', 'vif_model': 'vmxnet3'}] {{(pid=59518) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 606.863881] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Creating folder: Project (e48ea78a3aa548d89738c45e00618d64). Parent ref: group-v88807. {{(pid=59518) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 606.864504] env[59518]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-bf19ca62-0139-4bd4-9f27-753a83f84602 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.874263] env[59518]: INFO nova.virt.vmwareapi.vm_util [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Created folder: Project (e48ea78a3aa548d89738c45e00618d64) in parent group-v88807. [ 606.874481] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Creating folder: Instances. Parent ref: group-v88814. {{(pid=59518) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 606.874654] env[59518]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-969d06dd-52f0-4304-8c4a-0e766a0f3b0a {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.883894] env[59518]: INFO nova.virt.vmwareapi.vm_util [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Created folder: Instances in parent group-v88814. [ 606.884586] env[59518]: DEBUG oslo.service.loopingcall [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=59518) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 606.884586] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70] Creating VM on the ESX host {{(pid=59518) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 606.884586] env[59518]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-aa53568a-11ef-4dd8-b63c-fb8f9639c4c8 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.903053] env[59518]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 606.903053] env[59518]: value = "task-307911" [ 606.903053] env[59518]: _type = "Task" [ 606.903053] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 606.912189] env[59518]: DEBUG oslo_vmware.api [-] Task: {'id': task-307911, 'name': CreateVM_Task} progress is 0%. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 607.054656] env[59518]: DEBUG nova.policy [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c15418d98d47457fb0738efcfb6b1e1d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e8058037c59b4f388e4cac07cdf8be1d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59518) authorize /opt/stack/nova/nova/policy.py:203}} [ 607.250684] env[59518]: DEBUG oslo_vmware.api [-] Task: {'id': task-307908, 'name': CreateVM_Task, 'duration_secs': 0.424433} completed successfully. 
{{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 607.250890] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] Created VM on the ESX host {{(pid=59518) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 607.259200] env[59518]: DEBUG nova.compute.manager [req-86115ace-29a8-44fd-9564-e89315fbf640 req-55f184f6-3c68-46c9-b08b-5b594aeb688b service nova] [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] Received event network-vif-plugged-0bbc769a-e7ad-418c-93dc-6a5cd658e622 {{(pid=59518) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 607.259314] env[59518]: DEBUG oslo_concurrency.lockutils [req-86115ace-29a8-44fd-9564-e89315fbf640 req-55f184f6-3c68-46c9-b08b-5b594aeb688b service nova] Acquiring lock "21f44d88-a868-4765-95f8-8dbe8eccef7a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 607.259450] env[59518]: DEBUG oslo_concurrency.lockutils [req-86115ace-29a8-44fd-9564-e89315fbf640 req-55f184f6-3c68-46c9-b08b-5b594aeb688b service nova] Lock "21f44d88-a868-4765-95f8-8dbe8eccef7a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 607.259612] env[59518]: DEBUG oslo_concurrency.lockutils [req-86115ace-29a8-44fd-9564-e89315fbf640 req-55f184f6-3c68-46c9-b08b-5b594aeb688b service nova] Lock "21f44d88-a868-4765-95f8-8dbe8eccef7a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 607.259776] env[59518]: DEBUG nova.compute.manager [req-86115ace-29a8-44fd-9564-e89315fbf640 req-55f184f6-3c68-46c9-b08b-5b594aeb688b service nova] [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] No waiting events found dispatching network-vif-plugged-0bbc769a-e7ad-418c-93dc-6a5cd658e622 {{(pid=59518) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 607.259927] env[59518]: WARNING nova.compute.manager [req-86115ace-29a8-44fd-9564-e89315fbf640 req-55f184f6-3c68-46c9-b08b-5b594aeb688b service nova] [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] Received unexpected event network-vif-plugged-0bbc769a-e7ad-418c-93dc-6a5cd658e622 for instance with vm_state building and task_state spawning. 
[ 607.278043] env[59518]: DEBUG oslo_concurrency.lockutils [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 607.278212] env[59518]: DEBUG oslo_concurrency.lockutils [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 607.278529] env[59518]: DEBUG oslo_concurrency.lockutils [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 607.278797] env[59518]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-df00182b-601b-4665-a188-2715ec9d350f {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 607.284764] env[59518]: DEBUG oslo_vmware.api [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Waiting for the task: (returnval){ [ 607.284764] env[59518]: value = "session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]52ec5201-fb7e-1112-e581-e986c4c0ebcd" [ 607.284764] env[59518]: _type = "Task" [ 607.284764] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 607.292570] env[59518]: DEBUG oslo_vmware.api [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Task: {'id': session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]52ec5201-fb7e-1112-e581-e986c4c0ebcd, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 607.416862] env[59518]: DEBUG oslo_vmware.api [-] Task: {'id': task-307911, 'name': CreateVM_Task, 'duration_secs': 0.343547} completed successfully. 
{{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 607.417362] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70] Created VM on the ESX host {{(pid=59518) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 607.418508] env[59518]: DEBUG oslo_concurrency.lockutils [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 607.789317] env[59518]: DEBUG nova.network.neutron [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] Successfully created port: e2be4581-77a6-4a18-8394-62cc4710988c {{(pid=59518) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 607.799273] env[59518]: DEBUG oslo_concurrency.lockutils [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 607.799273] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] Processing image e70539a9-144d-4900-807e-914ae0cc8539 {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 607.799273] env[59518]: DEBUG oslo_concurrency.lockutils [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 607.799273] env[59518]: DEBUG oslo_concurrency.lockutils [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 607.800053] env[59518]: DEBUG oslo_concurrency.lockutils [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 607.800280] env[59518]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-784df642-726d-46e7-9b07-47e6ccfc0f07 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 607.807240] env[59518]: DEBUG oslo_vmware.api [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Waiting for the task: 
(returnval){ [ 607.807240] env[59518]: value = "session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]52da046c-a3ed-5b5e-a3fd-5faaeb602341" [ 607.807240] env[59518]: _type = "Task" [ 607.807240] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 607.815387] env[59518]: DEBUG oslo_vmware.api [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Task: {'id': session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]52da046c-a3ed-5b5e-a3fd-5faaeb602341, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 608.315682] env[59518]: DEBUG oslo_concurrency.lockutils [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 608.315914] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70] Processing image e70539a9-144d-4900-807e-914ae0cc8539 {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 608.318499] env[59518]: DEBUG oslo_concurrency.lockutils [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 608.699259] env[59518]: DEBUG nova.network.neutron [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] Successfully created port: 40b02e48-3fdf-4f44-9c50-03f4844f6ce3 {{(pid=59518) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 610.517708] env[59518]: DEBUG nova.network.neutron [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] Successfully updated port: 8a7ecefa-7677-47e3-aa52-02d74d7b8a40 {{(pid=59518) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 610.534221] env[59518]: DEBUG oslo_concurrency.lockutils [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Acquiring lock "refresh_cache-6ffd468f-b92f-45ae-834f-6daac20937ef" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 610.534371] env[59518]: DEBUG oslo_concurrency.lockutils [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Acquired lock "refresh_cache-6ffd468f-b92f-45ae-834f-6daac20937ef" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 610.534493] env[59518]: DEBUG nova.network.neutron [None 
req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] Building network info cache for instance {{(pid=59518) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 610.562876] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._sync_power_states {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 610.583356] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Getting list of instances from cluster (obj){ [ 610.583356] env[59518]: value = "domain-c8" [ 610.583356] env[59518]: _type = "ClusterComputeResource" [ 610.583356] env[59518]: } {{(pid=59518) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 610.584691] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b4782a16-5851-489f-81fb-d22c5af02aa7 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 610.598265] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Got total of 3 instances {{(pid=59518) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 610.598265] env[59518]: WARNING nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] While synchronizing instance power states, found 6 instances in the database and 3 instances on the hypervisor. [ 610.598265] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Triggering sync for uuid bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70 {{(pid=59518) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10218}} [ 610.598265] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Triggering sync for uuid 21f44d88-a868-4765-95f8-8dbe8eccef7a {{(pid=59518) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10218}} [ 610.598265] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Triggering sync for uuid b037b116-3b8c-4f10-990c-a855f96fa61c {{(pid=59518) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10218}} [ 610.598265] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Triggering sync for uuid 6ffd468f-b92f-45ae-834f-6daac20937ef {{(pid=59518) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10218}} [ 610.598265] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Triggering sync for uuid 37d863b9-bfcb-4d1f-b99b-832276bd640f {{(pid=59518) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10218}} [ 610.598635] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Triggering sync for uuid a894a8af-52b8-4b1c-a5ea-2469f06ea17a {{(pid=59518) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10218}} [ 610.598635] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Acquiring lock "bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=59518) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 610.598635] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Acquiring lock "21f44d88-a868-4765-95f8-8dbe8eccef7a" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 610.598635] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Acquiring lock "b037b116-3b8c-4f10-990c-a855f96fa61c" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 610.598942] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Acquiring lock "6ffd468f-b92f-45ae-834f-6daac20937ef" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 610.599173] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Acquiring lock "37d863b9-bfcb-4d1f-b99b-832276bd640f" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 610.599308] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Acquiring lock "a894a8af-52b8-4b1c-a5ea-2469f06ea17a" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 610.599482] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 610.599762] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Getting list of instances from cluster (obj){ [ 610.599762] env[59518]: value = "domain-c8" [ 610.599762] env[59518]: _type = "ClusterComputeResource" [ 610.599762] env[59518]: } {{(pid=59518) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 610.600714] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e8a6aa53-9b05-4a4e-b88f-cbcab06b507f {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 610.617635] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Got total of 3 instances {{(pid=59518) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 610.711061] env[59518]: DEBUG nova.network.neutron [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] Instance cache missing network info. 
{{(pid=59518) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 611.013204] env[59518]: DEBUG nova.compute.manager [req-cc9c8508-382b-42e3-8fa7-81fb2f9df311 req-efb3051f-07db-4e9b-aa62-514a6d0374e7 service nova] [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70] Received event network-vif-plugged-b13fae5b-bd06-45f6-8618-191c68992381 {{(pid=59518) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 611.013411] env[59518]: DEBUG oslo_concurrency.lockutils [req-cc9c8508-382b-42e3-8fa7-81fb2f9df311 req-efb3051f-07db-4e9b-aa62-514a6d0374e7 service nova] Acquiring lock "bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 611.013615] env[59518]: DEBUG oslo_concurrency.lockutils [req-cc9c8508-382b-42e3-8fa7-81fb2f9df311 req-efb3051f-07db-4e9b-aa62-514a6d0374e7 service nova] Lock "bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 611.013856] env[59518]: DEBUG oslo_concurrency.lockutils [req-cc9c8508-382b-42e3-8fa7-81fb2f9df311 req-efb3051f-07db-4e9b-aa62-514a6d0374e7 service nova] Lock "bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 611.014097] env[59518]: DEBUG nova.compute.manager [req-cc9c8508-382b-42e3-8fa7-81fb2f9df311 req-efb3051f-07db-4e9b-aa62-514a6d0374e7 service nova] [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70] No waiting events found dispatching network-vif-plugged-b13fae5b-bd06-45f6-8618-191c68992381 {{(pid=59518) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 611.014315] env[59518]: WARNING nova.compute.manager [req-cc9c8508-382b-42e3-8fa7-81fb2f9df311 req-efb3051f-07db-4e9b-aa62-514a6d0374e7 service nova] [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70] Received unexpected event network-vif-plugged-b13fae5b-bd06-45f6-8618-191c68992381 for instance with vm_state building and task_state spawning. [ 611.014559] env[59518]: DEBUG nova.compute.manager [req-cc9c8508-382b-42e3-8fa7-81fb2f9df311 req-efb3051f-07db-4e9b-aa62-514a6d0374e7 service nova] [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] Received event network-changed-0bbc769a-e7ad-418c-93dc-6a5cd658e622 {{(pid=59518) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 611.014752] env[59518]: DEBUG nova.compute.manager [req-cc9c8508-382b-42e3-8fa7-81fb2f9df311 req-efb3051f-07db-4e9b-aa62-514a6d0374e7 service nova] [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] Refreshing instance network info cache due to event network-changed-0bbc769a-e7ad-418c-93dc-6a5cd658e622. 
{{(pid=59518) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 611.014945] env[59518]: DEBUG oslo_concurrency.lockutils [req-cc9c8508-382b-42e3-8fa7-81fb2f9df311 req-efb3051f-07db-4e9b-aa62-514a6d0374e7 service nova] Acquiring lock "refresh_cache-21f44d88-a868-4765-95f8-8dbe8eccef7a" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 611.015069] env[59518]: DEBUG oslo_concurrency.lockutils [req-cc9c8508-382b-42e3-8fa7-81fb2f9df311 req-efb3051f-07db-4e9b-aa62-514a6d0374e7 service nova] Acquired lock "refresh_cache-21f44d88-a868-4765-95f8-8dbe8eccef7a" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 611.015221] env[59518]: DEBUG nova.network.neutron [req-cc9c8508-382b-42e3-8fa7-81fb2f9df311 req-efb3051f-07db-4e9b-aa62-514a6d0374e7 service nova] [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] Refreshing network info cache for port 0bbc769a-e7ad-418c-93dc-6a5cd658e622 {{(pid=59518) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 611.504099] env[59518]: DEBUG nova.network.neutron [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] Successfully updated port: e2be4581-77a6-4a18-8394-62cc4710988c {{(pid=59518) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 611.512641] env[59518]: DEBUG oslo_concurrency.lockutils [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Acquiring lock "refresh_cache-37d863b9-bfcb-4d1f-b99b-832276bd640f" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 611.512793] env[59518]: DEBUG oslo_concurrency.lockutils [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Acquired lock "refresh_cache-37d863b9-bfcb-4d1f-b99b-832276bd640f" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 611.512906] env[59518]: DEBUG nova.network.neutron [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] Building network info cache for instance {{(pid=59518) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 611.802813] env[59518]: DEBUG nova.network.neutron [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] Updating instance_info_cache with network_info: [{"id": "8a7ecefa-7677-47e3-aa52-02d74d7b8a40", "address": "fa:16:3e:be:21:f0", "network": {"id": "11d41703-6adf-472b-95de-b1a30a5442e6", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-158923891-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "d18e261ab4ee4cc4a05a3c624639dbce", "mtu": 8950, "physical_network": "default", "tunneled": 
false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0c293d47-74c0-49d7-a474-cdb643080f6f", "external-id": "nsx-vlan-transportzone-172", "segmentation_id": 172, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8a7ecefa-76", "ovs_interfaceid": "8a7ecefa-7677-47e3-aa52-02d74d7b8a40", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 611.822423] env[59518]: DEBUG oslo_concurrency.lockutils [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Releasing lock "refresh_cache-6ffd468f-b92f-45ae-834f-6daac20937ef" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 611.822727] env[59518]: DEBUG nova.compute.manager [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] Instance network_info: |[{"id": "8a7ecefa-7677-47e3-aa52-02d74d7b8a40", "address": "fa:16:3e:be:21:f0", "network": {"id": "11d41703-6adf-472b-95de-b1a30a5442e6", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-158923891-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "d18e261ab4ee4cc4a05a3c624639dbce", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0c293d47-74c0-49d7-a474-cdb643080f6f", "external-id": "nsx-vlan-transportzone-172", "segmentation_id": 172, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8a7ecefa-76", "ovs_interfaceid": "8a7ecefa-7677-47e3-aa52-02d74d7b8a40", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59518) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 611.823098] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:be:21:f0', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '0c293d47-74c0-49d7-a474-cdb643080f6f', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '8a7ecefa-7677-47e3-aa52-02d74d7b8a40', 'vif_model': 'vmxnet3'}] {{(pid=59518) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 611.839541] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Creating folder: Project (d18e261ab4ee4cc4a05a3c624639dbce). Parent ref: group-v88807. 
{{(pid=59518) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 611.840321] env[59518]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-be871292-02f2-466e-a868-415728dcb362 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 611.861915] env[59518]: INFO nova.virt.vmwareapi.vm_util [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Created folder: Project (d18e261ab4ee4cc4a05a3c624639dbce) in parent group-v88807. [ 611.862157] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Creating folder: Instances. Parent ref: group-v88817. {{(pid=59518) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 611.862369] env[59518]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-0774bbb5-5333-4a89-8660-21fed69577be {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 611.869813] env[59518]: DEBUG nova.network.neutron [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] Instance cache missing network info. {{(pid=59518) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 611.873316] env[59518]: INFO nova.virt.vmwareapi.vm_util [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Created folder: Instances in parent group-v88817. [ 611.873636] env[59518]: DEBUG oslo.service.loopingcall [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59518) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 611.873721] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] Creating VM on the ESX host {{(pid=59518) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 611.874070] env[59518]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-46046f12-0783-46d4-99fa-8e361676ad4d {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 611.908045] env[59518]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 611.908045] env[59518]: value = "task-307914" [ 611.908045] env[59518]: _type = "Task" [ 611.908045] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 611.914872] env[59518]: DEBUG oslo_vmware.api [-] Task: {'id': task-307914, 'name': CreateVM_Task} progress is 0%. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 612.101012] env[59518]: DEBUG nova.network.neutron [req-cc9c8508-382b-42e3-8fa7-81fb2f9df311 req-efb3051f-07db-4e9b-aa62-514a6d0374e7 service nova] [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] Updated VIF entry in instance network info cache for port 0bbc769a-e7ad-418c-93dc-6a5cd658e622. 
{{(pid=59518) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 612.101364] env[59518]: DEBUG nova.network.neutron [req-cc9c8508-382b-42e3-8fa7-81fb2f9df311 req-efb3051f-07db-4e9b-aa62-514a6d0374e7 service nova] [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] Updating instance_info_cache with network_info: [{"id": "0bbc769a-e7ad-418c-93dc-6a5cd658e622", "address": "fa:16:3e:31:b5:ee", "network": {"id": "d9098ce0-cf4b-4e0f-b9b7-b613f8d5b1d5", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.82", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "33bab75ff2cf45ecb4ab54af3adf83ee", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "eaf3dfa2-fa01-4d4d-8ecd-a9bc74d90ec2", "external-id": "nsx-vlan-transportzone-546", "segmentation_id": 546, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0bbc769a-e7", "ovs_interfaceid": "0bbc769a-e7ad-418c-93dc-6a5cd658e622", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 612.113907] env[59518]: DEBUG oslo_concurrency.lockutils [req-cc9c8508-382b-42e3-8fa7-81fb2f9df311 req-efb3051f-07db-4e9b-aa62-514a6d0374e7 service nova] Releasing lock "refresh_cache-21f44d88-a868-4765-95f8-8dbe8eccef7a" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 612.114218] env[59518]: DEBUG nova.compute.manager [req-cc9c8508-382b-42e3-8fa7-81fb2f9df311 req-efb3051f-07db-4e9b-aa62-514a6d0374e7 service nova] [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70] Received event network-changed-b13fae5b-bd06-45f6-8618-191c68992381 {{(pid=59518) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 612.114393] env[59518]: DEBUG nova.compute.manager [req-cc9c8508-382b-42e3-8fa7-81fb2f9df311 req-efb3051f-07db-4e9b-aa62-514a6d0374e7 service nova] [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70] Refreshing instance network info cache due to event network-changed-b13fae5b-bd06-45f6-8618-191c68992381. 
{{(pid=59518) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 612.114640] env[59518]: DEBUG oslo_concurrency.lockutils [req-cc9c8508-382b-42e3-8fa7-81fb2f9df311 req-efb3051f-07db-4e9b-aa62-514a6d0374e7 service nova] Acquiring lock "refresh_cache-bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 612.114815] env[59518]: DEBUG oslo_concurrency.lockutils [req-cc9c8508-382b-42e3-8fa7-81fb2f9df311 req-efb3051f-07db-4e9b-aa62-514a6d0374e7 service nova] Acquired lock "refresh_cache-bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 612.114985] env[59518]: DEBUG nova.network.neutron [req-cc9c8508-382b-42e3-8fa7-81fb2f9df311 req-efb3051f-07db-4e9b-aa62-514a6d0374e7 service nova] [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70] Refreshing network info cache for port b13fae5b-bd06-45f6-8618-191c68992381 {{(pid=59518) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 612.209566] env[59518]: DEBUG nova.network.neutron [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] Successfully updated port: 40b02e48-3fdf-4f44-9c50-03f4844f6ce3 {{(pid=59518) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 612.221348] env[59518]: DEBUG oslo_concurrency.lockutils [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Acquiring lock "refresh_cache-a894a8af-52b8-4b1c-a5ea-2469f06ea17a" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 612.221487] env[59518]: DEBUG oslo_concurrency.lockutils [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Acquired lock "refresh_cache-a894a8af-52b8-4b1c-a5ea-2469f06ea17a" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 612.221629] env[59518]: DEBUG nova.network.neutron [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] Building network info cache for instance {{(pid=59518) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 612.366380] env[59518]: DEBUG nova.network.neutron [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] Instance cache missing network info. {{(pid=59518) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 612.425886] env[59518]: DEBUG oslo_vmware.api [-] Task: {'id': task-307914, 'name': CreateVM_Task, 'duration_secs': 0.335093} completed successfully. 
{{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 612.426079] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] Created VM on the ESX host {{(pid=59518) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 612.426899] env[59518]: DEBUG oslo_concurrency.lockutils [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 612.427063] env[59518]: DEBUG oslo_concurrency.lockutils [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 612.427361] env[59518]: DEBUG oslo_concurrency.lockutils [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 612.427604] env[59518]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ad754e23-6410-4322-b512-ce50489cdf4f {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 612.432813] env[59518]: DEBUG oslo_vmware.api [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Waiting for the task: (returnval){ [ 612.432813] env[59518]: value = "session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]52877344-f62d-a1a5-dc9d-8d1a52d46462" [ 612.432813] env[59518]: _type = "Task" [ 612.432813] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 612.445092] env[59518]: DEBUG oslo_vmware.api [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Task: {'id': session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]52877344-f62d-a1a5-dc9d-8d1a52d46462, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 612.677102] env[59518]: DEBUG nova.network.neutron [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] Updating instance_info_cache with network_info: [{"id": "e2be4581-77a6-4a18-8394-62cc4710988c", "address": "fa:16:3e:f6:55:c8", "network": {"id": "d9098ce0-cf4b-4e0f-b9b7-b613f8d5b1d5", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.131", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "33bab75ff2cf45ecb4ab54af3adf83ee", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "eaf3dfa2-fa01-4d4d-8ecd-a9bc74d90ec2", "external-id": "nsx-vlan-transportzone-546", "segmentation_id": 546, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape2be4581-77", "ovs_interfaceid": "e2be4581-77a6-4a18-8394-62cc4710988c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 612.690819] env[59518]: DEBUG oslo_concurrency.lockutils [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Releasing lock "refresh_cache-37d863b9-bfcb-4d1f-b99b-832276bd640f" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 612.691185] env[59518]: DEBUG nova.compute.manager [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] Instance network_info: |[{"id": "e2be4581-77a6-4a18-8394-62cc4710988c", "address": "fa:16:3e:f6:55:c8", "network": {"id": "d9098ce0-cf4b-4e0f-b9b7-b613f8d5b1d5", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.131", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "33bab75ff2cf45ecb4ab54af3adf83ee", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "eaf3dfa2-fa01-4d4d-8ecd-a9bc74d90ec2", "external-id": "nsx-vlan-transportzone-546", "segmentation_id": 546, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape2be4581-77", "ovs_interfaceid": "e2be4581-77a6-4a18-8394-62cc4710988c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59518) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 612.691569] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None 
req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:f6:55:c8', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'eaf3dfa2-fa01-4d4d-8ecd-a9bc74d90ec2', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'e2be4581-77a6-4a18-8394-62cc4710988c', 'vif_model': 'vmxnet3'}] {{(pid=59518) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 612.699273] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Creating folder: Project (d3476ab8778c40218c4b2b54e1297f19). Parent ref: group-v88807. {{(pid=59518) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 612.699881] env[59518]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-3a92f578-d15b-4686-9db9-67c6f8b28efb {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 612.715776] env[59518]: INFO nova.virt.vmwareapi.vm_util [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Created folder: Project (d3476ab8778c40218c4b2b54e1297f19) in parent group-v88807. [ 612.716056] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Creating folder: Instances. Parent ref: group-v88820. {{(pid=59518) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 612.716287] env[59518]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1d83f80f-1e3e-4bd8-9496-b18491bd9bad {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 612.726781] env[59518]: INFO nova.virt.vmwareapi.vm_util [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Created folder: Instances in parent group-v88820. [ 612.727072] env[59518]: DEBUG oslo.service.loopingcall [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59518) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 612.727294] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] Creating VM on the ESX host {{(pid=59518) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 612.727495] env[59518]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-ff6e531c-57f6-47bb-8b54-871c45dc282e {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 612.751958] env[59518]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 612.751958] env[59518]: value = "task-307917" [ 612.751958] env[59518]: _type = "Task" [ 612.751958] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 612.761054] env[59518]: DEBUG oslo_vmware.api [-] Task: {'id': task-307917, 'name': CreateVM_Task} progress is 0%. 
{{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 612.946147] env[59518]: DEBUG oslo_concurrency.lockutils [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 612.946425] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] Processing image e70539a9-144d-4900-807e-914ae0cc8539 {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 612.946582] env[59518]: DEBUG oslo_concurrency.lockutils [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 613.096131] env[59518]: DEBUG nova.network.neutron [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] Updating instance_info_cache with network_info: [{"id": "40b02e48-3fdf-4f44-9c50-03f4844f6ce3", "address": "fa:16:3e:70:58:ff", "network": {"id": "d9098ce0-cf4b-4e0f-b9b7-b613f8d5b1d5", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.195", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "33bab75ff2cf45ecb4ab54af3adf83ee", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "eaf3dfa2-fa01-4d4d-8ecd-a9bc74d90ec2", "external-id": "nsx-vlan-transportzone-546", "segmentation_id": 546, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap40b02e48-3f", "ovs_interfaceid": "40b02e48-3fdf-4f44-9c50-03f4844f6ce3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 613.110286] env[59518]: DEBUG nova.network.neutron [req-cc9c8508-382b-42e3-8fa7-81fb2f9df311 req-efb3051f-07db-4e9b-aa62-514a6d0374e7 service nova] [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70] Updated VIF entry in instance network info cache for port b13fae5b-bd06-45f6-8618-191c68992381. 
{{(pid=59518) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 613.110675] env[59518]: DEBUG nova.network.neutron [req-cc9c8508-382b-42e3-8fa7-81fb2f9df311 req-efb3051f-07db-4e9b-aa62-514a6d0374e7 service nova] [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70] Updating instance_info_cache with network_info: [{"id": "b13fae5b-bd06-45f6-8618-191c68992381", "address": "fa:16:3e:56:61:a2", "network": {"id": "f82b2678-6625-457a-bc6c-6a4d9004014e", "bridge": "br-int", "label": "tempest-ImagesTestJSON-812614802-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e48ea78a3aa548d89738c45e00618d64", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e23c1d18-c841-49ea-95f3-df5ceac28afd", "external-id": "nsx-vlan-transportzone-774", "segmentation_id": 774, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb13fae5b-bd", "ovs_interfaceid": "b13fae5b-bd06-45f6-8618-191c68992381", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 613.112544] env[59518]: DEBUG oslo_concurrency.lockutils [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Releasing lock "refresh_cache-a894a8af-52b8-4b1c-a5ea-2469f06ea17a" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 613.112794] env[59518]: DEBUG nova.compute.manager [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] Instance network_info: |[{"id": "40b02e48-3fdf-4f44-9c50-03f4844f6ce3", "address": "fa:16:3e:70:58:ff", "network": {"id": "d9098ce0-cf4b-4e0f-b9b7-b613f8d5b1d5", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.195", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "33bab75ff2cf45ecb4ab54af3adf83ee", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "eaf3dfa2-fa01-4d4d-8ecd-a9bc74d90ec2", "external-id": "nsx-vlan-transportzone-546", "segmentation_id": 546, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap40b02e48-3f", "ovs_interfaceid": "40b02e48-3fdf-4f44-9c50-03f4844f6ce3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59518) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 613.113321] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None 
[ 613.113321] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:70:58:ff', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'eaf3dfa2-fa01-4d4d-8ecd-a9bc74d90ec2', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '40b02e48-3fdf-4f44-9c50-03f4844f6ce3', 'vif_model': 'vmxnet3'}] {{(pid=59518) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 613.121331] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Creating folder: Project (e8058037c59b4f388e4cac07cdf8be1d). Parent ref: group-v88807. {{(pid=59518) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 613.122421] env[59518]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c22c1cb3-10ab-47c4-b63f-6a731d486064 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 613.125070] env[59518]: DEBUG oslo_concurrency.lockutils [req-cc9c8508-382b-42e3-8fa7-81fb2f9df311 req-efb3051f-07db-4e9b-aa62-514a6d0374e7 service nova] Releasing lock "refresh_cache-bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 613.126536] env[59518]: DEBUG nova.compute.manager [req-cc9c8508-382b-42e3-8fa7-81fb2f9df311 req-efb3051f-07db-4e9b-aa62-514a6d0374e7 service nova] [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] Received event network-vif-plugged-8a7ecefa-7677-47e3-aa52-02d74d7b8a40 {{(pid=59518) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}}
[ 613.126756] env[59518]: DEBUG oslo_concurrency.lockutils [req-cc9c8508-382b-42e3-8fa7-81fb2f9df311 req-efb3051f-07db-4e9b-aa62-514a6d0374e7 service nova] Acquiring lock "6ffd468f-b92f-45ae-834f-6daac20937ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 613.126950] env[59518]: DEBUG oslo_concurrency.lockutils [req-cc9c8508-382b-42e3-8fa7-81fb2f9df311 req-efb3051f-07db-4e9b-aa62-514a6d0374e7 service nova] Lock "6ffd468f-b92f-45ae-834f-6daac20937ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 613.127104] env[59518]: DEBUG oslo_concurrency.lockutils [req-cc9c8508-382b-42e3-8fa7-81fb2f9df311 req-efb3051f-07db-4e9b-aa62-514a6d0374e7 service nova] Lock "6ffd468f-b92f-45ae-834f-6daac20937ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 613.127295] env[59518]: DEBUG nova.compute.manager [req-cc9c8508-382b-42e3-8fa7-81fb2f9df311 req-efb3051f-07db-4e9b-aa62-514a6d0374e7 service nova] [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] No waiting events found dispatching network-vif-plugged-8a7ecefa-7677-47e3-aa52-02d74d7b8a40 {{(pid=59518) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 613.127517] env[59518]: WARNING nova.compute.manager [req-cc9c8508-382b-42e3-8fa7-81fb2f9df311 req-efb3051f-07db-4e9b-aa62-514a6d0374e7 service nova] [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] Received unexpected event network-vif-plugged-8a7ecefa-7677-47e3-aa52-02d74d7b8a40 for instance with vm_state building and task_state spawning.
[ 613.137152] env[59518]: INFO nova.virt.vmwareapi.vm_util [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Created folder: Project (e8058037c59b4f388e4cac07cdf8be1d) in parent group-v88807.
[ 613.137355] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Creating folder: Instances. Parent ref: group-v88823. {{(pid=59518) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 613.139818] env[59518]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-4d5bccc4-e685-4f7f-b1d6-b2bcfa5be1bd {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 613.150092] env[59518]: INFO nova.virt.vmwareapi.vm_util [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Created folder: Instances in parent group-v88823.
[ 613.150183] env[59518]: DEBUG oslo.service.loopingcall [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59518) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 613.150345] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] Creating VM on the ESX host {{(pid=59518) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 613.150580] env[59518]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-8a874b79-9df3-45b4-bbac-962fc52922ba {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 613.177043] env[59518]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 613.177043] env[59518]: value = "task-307920"
[ 613.177043] env[59518]: _type = "Task"
[ 613.177043] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 613.194989] env[59518]: DEBUG oslo_vmware.api [-] Task: {'id': task-307920, 'name': CreateVM_Task} progress is 0%. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 613.261857] env[59518]: DEBUG oslo_vmware.api [-] Task: {'id': task-307917, 'name': CreateVM_Task, 'duration_secs': 0.332888} completed successfully. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 613.262025] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] Created VM on the ESX host {{(pid=59518) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 613.262683] env[59518]: DEBUG oslo_concurrency.lockutils [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 613.262836] env[59518]: DEBUG oslo_concurrency.lockutils [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 613.263139] env[59518]: DEBUG oslo_concurrency.lockutils [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}}
[ 613.263567] env[59518]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0625ead1-a6f4-4c77-b0ae-7c35abf4554a {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 613.270427] env[59518]: DEBUG oslo_vmware.api [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Waiting for the task: (returnval){
[ 613.270427] env[59518]: value = "session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]52c820a4-75b4-4fb3-01d2-59433ed9b577"
[ 613.270427] env[59518]: _type = "Task"
[ 613.270427] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 613.278212] env[59518]: DEBUG oslo_vmware.api [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Task: {'id': session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]52c820a4-75b4-4fb3-01d2-59433ed9b577, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
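Editor's note: the wait_for_task/_poll_task pairs above show oslo.vmware's pattern of submitting a vSphere task (CreateVM_Task, SearchDatastore_Task) and polling it on a fixed interval until it reports success or error. Below is a simplified, self-contained sketch of that loop; get_task_info is a hypothetical stand-in for the property read oslo.vmware performs against vCenter, not a real API.

import time

POLL_INTERVAL = 0.5  # seconds between polls


def get_task_info(task_ref):
    """Hypothetical stand-in for reading the vSphere task's 'info' property."""
    raise NotImplementedError


def wait_for_task(task_ref):
    # Poll until the task leaves the queued/running states (cf. _poll_task above).
    while True:
        info = get_task_info(task_ref)
        if info.state == "success":
            return info.result
        if info.state == "error":
            raise RuntimeError(info.error)
        # Still queued or running: report progress and try again.
        print(f"Task {task_ref} progress is {info.progress or 0}%.")
        time.sleep(POLL_INTERVAL)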
[ 613.324517] env[59518]: DEBUG nova.compute.manager [req-38fb058a-f03c-467f-b774-846482c389c7 req-aa0ac093-429c-49c6-bd80-eb776517f854 service nova] [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] Received event network-vif-plugged-e2be4581-77a6-4a18-8394-62cc4710988c {{(pid=59518) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}}
[ 613.324517] env[59518]: DEBUG oslo_concurrency.lockutils [req-38fb058a-f03c-467f-b774-846482c389c7 req-aa0ac093-429c-49c6-bd80-eb776517f854 service nova] Acquiring lock "37d863b9-bfcb-4d1f-b99b-832276bd640f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 613.324517] env[59518]: DEBUG oslo_concurrency.lockutils [req-38fb058a-f03c-467f-b774-846482c389c7 req-aa0ac093-429c-49c6-bd80-eb776517f854 service nova] Lock "37d863b9-bfcb-4d1f-b99b-832276bd640f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 613.324517] env[59518]: DEBUG oslo_concurrency.lockutils [req-38fb058a-f03c-467f-b774-846482c389c7 req-aa0ac093-429c-49c6-bd80-eb776517f854 service nova] Lock "37d863b9-bfcb-4d1f-b99b-832276bd640f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 613.324745] env[59518]: DEBUG nova.compute.manager [req-38fb058a-f03c-467f-b774-846482c389c7 req-aa0ac093-429c-49c6-bd80-eb776517f854 service nova] [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] No waiting events found dispatching network-vif-plugged-e2be4581-77a6-4a18-8394-62cc4710988c {{(pid=59518) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 613.324745] env[59518]: WARNING nova.compute.manager [req-38fb058a-f03c-467f-b774-846482c389c7 req-aa0ac093-429c-49c6-bd80-eb776517f854 service nova] [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] Received unexpected event network-vif-plugged-e2be4581-77a6-4a18-8394-62cc4710988c for instance with vm_state building and task_state spawning.
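Editor's note: the "Received unexpected event" warnings above come from Nova's external-event path: Neutron reports network-vif-plugged, and the compute manager pops a matching waiter for that instance and event name; if nothing is waiting yet (the instance is still building), the event is logged and dropped. A rough sketch of that dispatch pattern using plain threading primitives rather than Nova's InstanceEvents class; names here are illustrative, not Nova's code.

import threading


class EventWaiters:
    """Simplified per-instance event registry (illustrative only)."""

    def __init__(self):
        self._lock = threading.Lock()   # plays the role of the "-events" lock above
        self._waiters = {}              # (instance_uuid, event_name) -> threading.Event

    def prepare(self, instance_uuid, event_name):
        # A spawning thread registers interest before triggering the VIF plug.
        ev = threading.Event()
        with self._lock:
            self._waiters[(instance_uuid, event_name)] = ev
        return ev

    def pop_and_signal(self, instance_uuid, event_name):
        # The notification path pops the waiter; if none exists, the event is dropped.
        with self._lock:
            ev = self._waiters.pop((instance_uuid, event_name), None)
        if ev is None:
            print(f"Received unexpected event {event_name} for {instance_uuid}")
            return False
        ev.set()
        return True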
[ 613.688259] env[59518]: DEBUG oslo_vmware.api [-] Task: {'id': task-307920, 'name': CreateVM_Task, 'duration_secs': 0.295312} completed successfully. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 613.688572] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] Created VM on the ESX host {{(pid=59518) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 613.689323] env[59518]: DEBUG oslo_concurrency.lockutils [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 613.781283] env[59518]: DEBUG oslo_concurrency.lockutils [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 613.781507] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] Processing image e70539a9-144d-4900-807e-914ae0cc8539 {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 613.781713] env[59518]: DEBUG oslo_concurrency.lockutils [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 613.781919] env[59518]: DEBUG oslo_concurrency.lockutils [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 613.782249] env[59518]: DEBUG oslo_concurrency.lockutils [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}}
[ 613.782494] env[59518]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3f93894d-511f-4f7f-abfe-adb4e5ef0324 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 613.788821] env[59518]: DEBUG oslo_vmware.api [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Waiting for the task: (returnval){
[ 613.788821] env[59518]: value = "session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]5287015b-8cfc-b74b-ad0a-f7ecf61dfe90"
[ 613.788821] env[59518]: _type = "Task"
[ 613.788821] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 613.795043] env[59518]: DEBUG oslo_vmware.api [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Task: {'id': session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]5287015b-8cfc-b74b-ad0a-f7ecf61dfe90, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 614.297944] env[59518]: DEBUG oslo_concurrency.lockutils [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 614.298220] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] Processing image e70539a9-144d-4900-807e-914ae0cc8539 {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 614.298379] env[59518]: DEBUG oslo_concurrency.lockutils [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 614.488513] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 614.489762] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 614.489762] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Starting heal instance info cache {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9808}}
[ 614.489762] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Rebuilding the list of instances to heal {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}}
[ 614.509015] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}}
[ 614.509178] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}}
[ 614.509296] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}}
[ 614.509465] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}}
[ 614.509805] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}}
[ 614.509805] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}}
[ 614.510068] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Didn't find any instances for network info cache update. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9894}}
[ 614.510413] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 614.510689] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 614.510816] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 614.511275] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 614.511275] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 614.511387] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 614.511547] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=59518) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10427}}
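Editor's note: the "Running periodic task ..." records above are oslo.service's periodic-task loop iterating over the ComputeManager's decorated methods. A minimal sketch of how such tasks are declared with oslo.service (assuming oslo.service and oslo.config are installed; the task body is a placeholder, not Nova's implementation):

from oslo_config import cfg
from oslo_service import periodic_task

CONF = cfg.CONF


class Manager(periodic_task.PeriodicTasks):
    def __init__(self):
        super().__init__(CONF)

    @periodic_task.periodic_task(spacing=60)
    def _heal_instance_info_cache(self, context):
        # Placeholder body; the real task refreshes one instance's network
        # info cache per run, skipping instances that are still Building.
        print("Running periodic task Manager._heal_instance_info_cache")


mgr = Manager()
mgr.run_periodic_tasks(context=None)  # the service normally drives this on a timer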
[ 614.511680] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 614.527663] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 614.527860] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 614.528161] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 614.528655] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59518) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}}
[ 614.529909] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d564f1e4-d5ce-43ba-962e-1531803ff8a7 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 614.540201] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-718801a3-1888-4c9a-ae2c-6345945804d3 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 614.557684] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6489aab4-6fd5-4499-abce-9e8ee72b71db {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 614.565025] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9fbf0c16-8dba-4db1-8ef0-edde25da0b24 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 614.603145] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181804MB free_disk=174GB free_vcpus=48 pci_devices=None {{(pid=59518) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}}
[ 614.603306] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 614.605452] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 614.670749] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 614.670749] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 21f44d88-a868-4765-95f8-8dbe8eccef7a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 614.670749] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance b037b116-3b8c-4f10-990c-a855f96fa61c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 614.670931] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 6ffd468f-b92f-45ae-834f-6daac20937ef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 614.670972] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 37d863b9-bfcb-4d1f-b99b-832276bd640f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 614.671090] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance a894a8af-52b8-4b1c-a5ea-2469f06ea17a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 614.671365] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Total usable vcpus: 48, total allocated vcpus: 6 {{(pid=59518) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}}
[ 614.671427] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1280MB phys_disk=200GB used_disk=6GB total_vcpus=48 used_vcpus=6 pci_stats=[] {{(pid=59518) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}}
[ 614.789594] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0f6f988a-36f2-4268-9161-2e0174c0dfde {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 614.797941] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf3b104c-8527-42ab-917b-65e7a814563f {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 614.827665] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c52a2b3-ec05-4fa4-b8ea-4c780f7a4b44 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 614.836700] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ddf16c6-9d19-410a-aa91-2a554dcdb613 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 614.851299] env[59518]: DEBUG nova.compute.provider_tree [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Inventory has not changed in ProviderTree for provider: ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd {{(pid=59518) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 614.859016] env[59518]: DEBUG nova.scheduler.client.report [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Inventory has not changed for provider ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 174, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59518) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 614.874890] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59518) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}}
[ 614.875071] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.272s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
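Editor's note: from the inventory record above, placement-style capacity for each resource class is (total - reserved) * allocation_ratio. The snippet below just recomputes that from the logged data (inventory abridged to the three relevant keys):

inventory = {
    "VCPU":      {"total": 48,     "reserved": 0,   "allocation_ratio": 4.0},
    "MEMORY_MB": {"total": 196590, "reserved": 512, "allocation_ratio": 1.0},
    "DISK_GB":   {"total": 400,    "reserved": 0,   "allocation_ratio": 1.0},
}

for rc, inv in inventory.items():
    capacity = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
    print(rc, capacity)
# VCPU 192.0, MEMORY_MB 196078.0, DISK_GB 400.0 schedulable units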
[ 616.048020] env[59518]: DEBUG oslo_concurrency.lockutils [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Acquiring lock "b4fb287e-7329-4002-911a-2d1eee138372" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 616.048306] env[59518]: DEBUG oslo_concurrency.lockutils [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Lock "b4fb287e-7329-4002-911a-2d1eee138372" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 616.060366] env[59518]: DEBUG nova.compute.manager [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: b4fb287e-7329-4002-911a-2d1eee138372] Starting instance... {{(pid=59518) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 616.128235] env[59518]: DEBUG oslo_concurrency.lockutils [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 616.128482] env[59518]: DEBUG oslo_concurrency.lockutils [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 616.130059] env[59518]: INFO nova.compute.claims [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: b4fb287e-7329-4002-911a-2d1eee138372] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 616.356479] env[59518]: DEBUG nova.compute.manager [req-0a8fe68f-fc1e-468e-8cf8-447afaf45f03 req-42891b65-4ffc-4b00-ac58-7817e2cbefc5 service nova] [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] Received event network-changed-8a7ecefa-7677-47e3-aa52-02d74d7b8a40 {{(pid=59518) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}}
[ 616.356738] env[59518]: DEBUG nova.compute.manager [req-0a8fe68f-fc1e-468e-8cf8-447afaf45f03 req-42891b65-4ffc-4b00-ac58-7817e2cbefc5 service nova] [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] Refreshing instance network info cache due to event network-changed-8a7ecefa-7677-47e3-aa52-02d74d7b8a40. {{(pid=59518) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}}
[ 616.357019] env[59518]: DEBUG oslo_concurrency.lockutils [req-0a8fe68f-fc1e-468e-8cf8-447afaf45f03 req-42891b65-4ffc-4b00-ac58-7817e2cbefc5 service nova] Acquiring lock "refresh_cache-6ffd468f-b92f-45ae-834f-6daac20937ef" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 616.357221] env[59518]: DEBUG oslo_concurrency.lockutils [req-0a8fe68f-fc1e-468e-8cf8-447afaf45f03 req-42891b65-4ffc-4b00-ac58-7817e2cbefc5 service nova] Acquired lock "refresh_cache-6ffd468f-b92f-45ae-834f-6daac20937ef" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 616.357440] env[59518]: DEBUG nova.network.neutron [req-0a8fe68f-fc1e-468e-8cf8-447afaf45f03 req-42891b65-4ffc-4b00-ac58-7817e2cbefc5 service nova] [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] Refreshing network info cache for port 8a7ecefa-7677-47e3-aa52-02d74d7b8a40 {{(pid=59518) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}}
[ 616.387513] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ea9e6387-21cb-4c05-983d-8cb4c5f789b8 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 616.398377] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a4e4e144-b014-4df3-b727-e078870f5ba8 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 616.439965] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f443258c-85ef-4cad-9b31-7d1b3997968d {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 616.448310] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-86cf7db0-1604-4310-8e7e-503129535f52 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 616.464363] env[59518]: DEBUG nova.compute.provider_tree [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Inventory has not changed in ProviderTree for provider: ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd {{(pid=59518) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 616.474699] env[59518]: DEBUG nova.scheduler.client.report [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Inventory has not changed for provider ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 174, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59518) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 616.489137] env[59518]: DEBUG oslo_concurrency.lockutils [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.361s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 616.489665] env[59518]: DEBUG nova.compute.manager [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: b4fb287e-7329-4002-911a-2d1eee138372] Start building networks asynchronously for instance. {{(pid=59518) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 616.537850] env[59518]: DEBUG nova.compute.utils [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Using /dev/sd instead of None {{(pid=59518) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 616.547844] env[59518]: DEBUG nova.compute.manager [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: b4fb287e-7329-4002-911a-2d1eee138372] Allocating IP information in the background. {{(pid=59518) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 616.548223] env[59518]: DEBUG nova.network.neutron [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: b4fb287e-7329-4002-911a-2d1eee138372] allocate_for_instance() {{(pid=59518) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 616.552176] env[59518]: DEBUG nova.compute.manager [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: b4fb287e-7329-4002-911a-2d1eee138372] Start building block device mappings for instance. {{(pid=59518) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 616.636879] env[59518]: DEBUG nova.compute.manager [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: b4fb287e-7329-4002-911a-2d1eee138372] Start spawning the instance on the hypervisor. {{(pid=59518) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 616.661139] env[59518]: DEBUG nova.policy [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e76e6170983343eb98ce9b38f7160f5e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'de52be7e32e5496f8ee12e4750b3644d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59518) authorize /opt/stack/nova/nova/policy.py:203}}
[ 616.663595] env[59518]: DEBUG nova.virt.hardware [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-05-28T13:15:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-05-28T13:15:21Z,direct_url=,disk_format='vmdk',id=e70539a9-144d-4900-807e-914ae0cc8539,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='33bab75ff2cf45ecb4ab54af3adf83ee',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-05-28T13:15:22Z,virtual_size=,visibility=), allow threads: False {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 616.663996] env[59518]: DEBUG nova.virt.hardware [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Flavor limits 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 616.664340] env[59518]: DEBUG nova.virt.hardware [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Image limits 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 616.664670] env[59518]: DEBUG nova.virt.hardware [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Flavor pref 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 616.665040] env[59518]: DEBUG nova.virt.hardware [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Image pref 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 616.665419] env[59518]: DEBUG nova.virt.hardware [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 616.665735] env[59518]: DEBUG nova.virt.hardware [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 616.669261] env[59518]: DEBUG nova.virt.hardware [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59518) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 616.669570] env[59518]: DEBUG nova.virt.hardware [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Got 1 possible topologies {{(pid=59518) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 616.669860] env[59518]: DEBUG nova.virt.hardware [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 616.670466] env[59518]: DEBUG nova.virt.hardware [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
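Editor's note: the nova.virt.hardware records above walk Nova's CPU-topology selection; with no flavor or image limits, the maxima default to 65536 and a 1-vCPU guest yields the single topology 1 socket / 1 core / 1 thread. A brute-force sketch of that enumeration (illustrative; Nova's actual filtering in nova/virt/hardware.py is more involved):

import itertools

def possible_topologies(vcpus, max_sockets=65536, max_cores=65536, max_threads=65536):
    # Enumerate (sockets, cores, threads) triples whose product is exactly vcpus.
    for s, c, t in itertools.product(range(1, min(vcpus, max_sockets) + 1),
                                     range(1, min(vcpus, max_cores) + 1),
                                     range(1, min(vcpus, max_threads) + 1)):
        if s * c * t == vcpus:
            yield (s, c, t)

print(list(possible_topologies(1)))  # [(1, 1, 1)], matching the log above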
{{(pid=59518) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 617.779178] env[59518]: DEBUG nova.network.neutron [req-0a8fe68f-fc1e-468e-8cf8-447afaf45f03 req-42891b65-4ffc-4b00-ac58-7817e2cbefc5 service nova] [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] Updating instance_info_cache with network_info: [{"id": "8a7ecefa-7677-47e3-aa52-02d74d7b8a40", "address": "fa:16:3e:be:21:f0", "network": {"id": "11d41703-6adf-472b-95de-b1a30a5442e6", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-158923891-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "d18e261ab4ee4cc4a05a3c624639dbce", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0c293d47-74c0-49d7-a474-cdb643080f6f", "external-id": "nsx-vlan-transportzone-172", "segmentation_id": 172, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8a7ecefa-76", "ovs_interfaceid": "8a7ecefa-7677-47e3-aa52-02d74d7b8a40", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 617.794133] env[59518]: DEBUG oslo_concurrency.lockutils [req-0a8fe68f-fc1e-468e-8cf8-447afaf45f03 req-42891b65-4ffc-4b00-ac58-7817e2cbefc5 service nova] Releasing lock "refresh_cache-6ffd468f-b92f-45ae-834f-6daac20937ef" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 617.794388] env[59518]: DEBUG nova.compute.manager [req-0a8fe68f-fc1e-468e-8cf8-447afaf45f03 req-42891b65-4ffc-4b00-ac58-7817e2cbefc5 service nova] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] Received event network-vif-plugged-40b02e48-3fdf-4f44-9c50-03f4844f6ce3 {{(pid=59518) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 617.794580] env[59518]: DEBUG oslo_concurrency.lockutils [req-0a8fe68f-fc1e-468e-8cf8-447afaf45f03 req-42891b65-4ffc-4b00-ac58-7817e2cbefc5 service nova] Acquiring lock "a894a8af-52b8-4b1c-a5ea-2469f06ea17a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 617.794776] env[59518]: DEBUG oslo_concurrency.lockutils [req-0a8fe68f-fc1e-468e-8cf8-447afaf45f03 req-42891b65-4ffc-4b00-ac58-7817e2cbefc5 service nova] Lock "a894a8af-52b8-4b1c-a5ea-2469f06ea17a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 617.795213] env[59518]: DEBUG oslo_concurrency.lockutils [req-0a8fe68f-fc1e-468e-8cf8-447afaf45f03 req-42891b65-4ffc-4b00-ac58-7817e2cbefc5 service nova] Lock "a894a8af-52b8-4b1c-a5ea-2469f06ea17a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 617.795213] env[59518]: DEBUG nova.compute.manager 
[req-0a8fe68f-fc1e-468e-8cf8-447afaf45f03 req-42891b65-4ffc-4b00-ac58-7817e2cbefc5 service nova] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] No waiting events found dispatching network-vif-plugged-40b02e48-3fdf-4f44-9c50-03f4844f6ce3 {{(pid=59518) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 617.795569] env[59518]: WARNING nova.compute.manager [req-0a8fe68f-fc1e-468e-8cf8-447afaf45f03 req-42891b65-4ffc-4b00-ac58-7817e2cbefc5 service nova] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] Received unexpected event network-vif-plugged-40b02e48-3fdf-4f44-9c50-03f4844f6ce3 for instance with vm_state building and task_state spawning. [ 617.795652] env[59518]: DEBUG nova.compute.manager [req-0a8fe68f-fc1e-468e-8cf8-447afaf45f03 req-42891b65-4ffc-4b00-ac58-7817e2cbefc5 service nova] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] Received event network-changed-40b02e48-3fdf-4f44-9c50-03f4844f6ce3 {{(pid=59518) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 617.795776] env[59518]: DEBUG nova.compute.manager [req-0a8fe68f-fc1e-468e-8cf8-447afaf45f03 req-42891b65-4ffc-4b00-ac58-7817e2cbefc5 service nova] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] Refreshing instance network info cache due to event network-changed-40b02e48-3fdf-4f44-9c50-03f4844f6ce3. {{(pid=59518) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 617.795963] env[59518]: DEBUG oslo_concurrency.lockutils [req-0a8fe68f-fc1e-468e-8cf8-447afaf45f03 req-42891b65-4ffc-4b00-ac58-7817e2cbefc5 service nova] Acquiring lock "refresh_cache-a894a8af-52b8-4b1c-a5ea-2469f06ea17a" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 617.796112] env[59518]: DEBUG oslo_concurrency.lockutils [req-0a8fe68f-fc1e-468e-8cf8-447afaf45f03 req-42891b65-4ffc-4b00-ac58-7817e2cbefc5 service nova] Acquired lock "refresh_cache-a894a8af-52b8-4b1c-a5ea-2469f06ea17a" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 617.796273] env[59518]: DEBUG nova.network.neutron [req-0a8fe68f-fc1e-468e-8cf8-447afaf45f03 req-42891b65-4ffc-4b00-ac58-7817e2cbefc5 service nova] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] Refreshing network info cache for port 40b02e48-3fdf-4f44-9c50-03f4844f6ce3 {{(pid=59518) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 617.919750] env[59518]: DEBUG oslo_concurrency.lockutils [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Acquiring lock "a85abea1-8e8d-4007-803d-e36fff55e587" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 617.919750] env[59518]: DEBUG oslo_concurrency.lockutils [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Lock "a85abea1-8e8d-4007-803d-e36fff55e587" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 617.942539] env[59518]: DEBUG nova.compute.manager [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] 
[instance: a85abea1-8e8d-4007-803d-e36fff55e587] Starting instance... {{(pid=59518) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 617.969227] env[59518]: DEBUG oslo_concurrency.lockutils [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Acquiring lock "eead3c41-1a63-48f7-941e-24470658ed13" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 617.969454] env[59518]: DEBUG oslo_concurrency.lockutils [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Lock "eead3c41-1a63-48f7-941e-24470658ed13" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 617.987533] env[59518]: DEBUG nova.compute.manager [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: eead3c41-1a63-48f7-941e-24470658ed13] Starting instance... {{(pid=59518) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 617.998022] env[59518]: DEBUG oslo_concurrency.lockutils [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Acquiring lock "04a58b0b-dfd8-4227-9c10-a69225fa5a53" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 617.998283] env[59518]: DEBUG oslo_concurrency.lockutils [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Lock "04a58b0b-dfd8-4227-9c10-a69225fa5a53" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 618.007638] env[59518]: DEBUG nova.compute.manager [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Starting instance... 
{{(pid=59518) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 618.064144] env[59518]: DEBUG oslo_concurrency.lockutils [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 618.064144] env[59518]: DEBUG oslo_concurrency.lockutils [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 618.064144] env[59518]: INFO nova.compute.claims [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 618.103785] env[59518]: DEBUG oslo_concurrency.lockutils [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 618.113166] env[59518]: DEBUG nova.compute.manager [req-0e7e8e97-9e89-484c-99dd-0f369dfc32aa req-8dbc835b-de04-4af4-9c72-f8695e980b8f service nova] [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] Received event network-changed-e2be4581-77a6-4a18-8394-62cc4710988c {{(pid=59518) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 618.113381] env[59518]: DEBUG nova.compute.manager [req-0e7e8e97-9e89-484c-99dd-0f369dfc32aa req-8dbc835b-de04-4af4-9c72-f8695e980b8f service nova] [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] Refreshing instance network info cache due to event network-changed-e2be4581-77a6-4a18-8394-62cc4710988c. 
{{(pid=59518) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 618.113913] env[59518]: DEBUG oslo_concurrency.lockutils [req-0e7e8e97-9e89-484c-99dd-0f369dfc32aa req-8dbc835b-de04-4af4-9c72-f8695e980b8f service nova] Acquiring lock "refresh_cache-37d863b9-bfcb-4d1f-b99b-832276bd640f" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 618.114052] env[59518]: DEBUG oslo_concurrency.lockutils [req-0e7e8e97-9e89-484c-99dd-0f369dfc32aa req-8dbc835b-de04-4af4-9c72-f8695e980b8f service nova] Acquired lock "refresh_cache-37d863b9-bfcb-4d1f-b99b-832276bd640f" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 618.114208] env[59518]: DEBUG nova.network.neutron [req-0e7e8e97-9e89-484c-99dd-0f369dfc32aa req-8dbc835b-de04-4af4-9c72-f8695e980b8f service nova] [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] Refreshing network info cache for port e2be4581-77a6-4a18-8394-62cc4710988c {{(pid=59518) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 618.124635] env[59518]: DEBUG oslo_concurrency.lockutils [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 618.284242] env[59518]: DEBUG oslo_concurrency.lockutils [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Acquiring lock "282b61db-76cd-44c3-b500-7a465e903c97" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 618.284463] env[59518]: DEBUG oslo_concurrency.lockutils [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Lock "282b61db-76cd-44c3-b500-7a465e903c97" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 618.328572] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9102ccb5-2895-4143-b613-ce4fdbf5c92c {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 618.337105] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0536b298-aa7c-4fa3-b412-ac90a1d8ced6 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 618.341573] env[59518]: DEBUG nova.network.neutron [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: b4fb287e-7329-4002-911a-2d1eee138372] Successfully created port: 85370e19-d1df-4ff6-b1fb-5624ef63bd04 {{(pid=59518) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 618.371001] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d2102d0f-76d6-42b7-ab21-c9bb7d314ad7 {{(pid=59518) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 618.379843] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-156dc5f1-5bc6-4c4a-afbb-e3f8721cece3 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 618.396972] env[59518]: DEBUG nova.compute.provider_tree [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Inventory has not changed in ProviderTree for provider: ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd {{(pid=59518) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 618.405630] env[59518]: DEBUG nova.scheduler.client.report [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Inventory has not changed for provider ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 174, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59518) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 618.420609] env[59518]: DEBUG oslo_concurrency.lockutils [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.359s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 618.421262] env[59518]: DEBUG nova.compute.manager [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Start building networks asynchronously for instance. 
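{{(pid=59518) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}

The inventory dict reported to placement above bounds what the scheduler may consume from this provider. A back-of-the-envelope sketch of the effective capacity each resource class exposes, assuming the standard placement formula (total - reserved) * allocation_ratio; the helper itself is illustrative, not Nova or placement code:

```python
# Illustrative helper: effective schedulable capacity implied by the
# inventory logged for provider ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd.
# (total - reserved) * allocation_ratio is the usual placement capacity
# calculation; the function is not taken from Nova/placement source.
inventory = {
    "VCPU": {"total": 48, "reserved": 0, "allocation_ratio": 4.0},
    "MEMORY_MB": {"total": 196590, "reserved": 512, "allocation_ratio": 1.0},
    "DISK_GB": {"total": 400, "reserved": 0, "allocation_ratio": 1.0},
}


def effective_capacity(inv):
    return {rc: int((v["total"] - v["reserved"]) * v["allocation_ratio"])
            for rc, v in inv.items()}


print(effective_capacity(inventory))
# {'VCPU': 192, 'MEMORY_MB': 196078, 'DISK_GB': 400}
```

With 128 MB m1.nano claims against ~196 GB of effective memory, the repeated "Claim successful" lines above are the expected outcome.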
[ 618.423762] env[59518]: DEBUG oslo_concurrency.lockutils [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.321s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 618.425113] env[59518]: INFO nova.compute.claims [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: eead3c41-1a63-48f7-941e-24470658ed13] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 618.459926] env[59518]: DEBUG nova.compute.utils [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Using /dev/sd instead of None {{(pid=59518) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 618.460984] env[59518]: DEBUG nova.compute.manager [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Allocating IP information in the background. {{(pid=59518) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 618.461156] env[59518]: DEBUG nova.network.neutron [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] allocate_for_instance() {{(pid=59518) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 618.476104] env[59518]: DEBUG nova.compute.manager [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Start building block device mappings for instance. {{(pid=59518) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 618.572177] env[59518]: DEBUG nova.compute.manager [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Start spawning the instance on the hypervisor.
{{(pid=59518) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 618.603781] env[59518]: DEBUG nova.virt.hardware [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-05-28T13:15:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-05-28T13:15:21Z,direct_url=,disk_format='vmdk',id=e70539a9-144d-4900-807e-914ae0cc8539,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='33bab75ff2cf45ecb4ab54af3adf83ee',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-05-28T13:15:22Z,virtual_size=,visibility=), allow threads: False {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 618.604044] env[59518]: DEBUG nova.virt.hardware [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Flavor limits 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 618.604194] env[59518]: DEBUG nova.virt.hardware [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Image limits 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 618.604364] env[59518]: DEBUG nova.virt.hardware [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Flavor pref 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 618.604496] env[59518]: DEBUG nova.virt.hardware [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Image pref 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 618.604633] env[59518]: DEBUG nova.virt.hardware [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 618.604830] env[59518]: DEBUG nova.virt.hardware [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 618.604978] env[59518]: DEBUG nova.virt.hardware [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59518) _get_possible_cpu_topologies 
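/opt/stack/nova/nova/virt/hardware.py:471}}

The hardware negotiation above folds flavor and image CPU constraints into candidate topologies; with all limits unset (logged as 0, so the 65536 defaults apply) and one vCPU, the only candidate is 1 socket x 1 core x 1 thread, as the entries just below confirm. A simplified stand-in for that search, assuming only the semantics visible in the log lines rather than Nova's real hardware.py:

```python
# Simplified stand-in for the topology enumeration logged here: yield
# (sockets, cores, threads) triples whose product equals the vCPU count
# and which respect the limits. Mirrors the log's semantics only; it is
# not the real nova.virt.hardware implementation.
import itertools


def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                        max_threads=65536):
    for s, c, t in itertools.product(range(1, vcpus + 1), repeat=3):
        if (s * c * t == vcpus and s <= max_sockets
                and c <= max_cores and t <= max_threads):
            yield (s, c, t)


print(list(possible_topologies(1)))  # [(1, 1, 1)] -- "Got 1 possible topologies"
```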
[ 618.605132] env[59518]: DEBUG nova.virt.hardware [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Got 1 possible topologies {{(pid=59518) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 618.605283] env[59518]: DEBUG nova.virt.hardware [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 618.605445] env[59518]: DEBUG nova.virt.hardware [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 618.606343] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4214aa8d-a6fe-43fa-a266-11ae52cc76b7 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 618.617838] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c4805f53-3546-4b6c-8e30-289948446906 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 618.710673] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c3674a0-a2e0-4d48-a18d-66c12c6ca8cd {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 618.718249] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e810d8fb-2c11-4f25-94ca-3d656fe7035b {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 618.748473] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef6bd047-2cd5-4f5a-92a9-0ad93ab88aff {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 618.761455] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fa6b4c7d-c731-44c3-aa23-7c7be3d98402 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 618.777712] env[59518]: DEBUG nova.compute.provider_tree [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Inventory has not changed in ProviderTree for provider: ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd {{(pid=59518) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 618.780769] env[59518]: DEBUG nova.policy [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7b517c3c7a4349a5b8ec718f4d88ec18', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b463f53c58ef49fa918299d1ea0f0a87', 'project_domain_id': 'default', 'roles':
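['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59518) authorize /opt/stack/nova/nova/policy.py:203}}

The failed policy check just above is expected for a non-admin tempest credential: the request context has is_admin False, so an admin-only rule rejects it and Nova simply skips external networks for this user. A minimal oslo.policy reproduction of that outcome; the check string 'is_admin:True' is an assumption standing in for Nova's default, not read from its shipped policy files:

```python
# Minimal sketch reproducing the failed policy check. The default rule
# 'is_admin:True' is an assumption for illustration, not taken from
# Nova's actual policy definitions.
from oslo_config import cfg
from oslo_policy import policy

enforcer = policy.Enforcer(cfg.CONF)
enforcer.register_default(
    policy.RuleDefault("network:attach_external_network", "is_admin:True"))

# Trimmed-down version of the credentials dict shown in the log entry.
creds = {"is_admin": False, "roles": ["reader", "member"],
         "project_id": "b463f53c58ef49fa918299d1ea0f0a87"}

# authorize() returns False here, matching the "Policy check ... failed"
# DEBUG line; with do_raise left at its default, no exception is raised.
print(enforcer.authorize("network:attach_external_network", {}, creds))
```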
[ 618.786244] env[59518]: DEBUG nova.scheduler.client.report [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Inventory has not changed for provider ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 174, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59518) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 618.801007] env[59518]: DEBUG oslo_concurrency.lockutils [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.377s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 618.801549] env[59518]: DEBUG nova.compute.manager [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: eead3c41-1a63-48f7-941e-24470658ed13] Start building networks asynchronously for instance. {{(pid=59518) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 618.803901] env[59518]: DEBUG oslo_concurrency.lockutils [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.679s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 618.805276] env[59518]: INFO nova.compute.claims [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 618.843005] env[59518]: DEBUG nova.compute.utils [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Using /dev/sd instead of None {{(pid=59518) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 618.844383] env[59518]: DEBUG nova.compute.manager [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: eead3c41-1a63-48f7-941e-24470658ed13] Allocating IP information in the background.
{{(pid=59518) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 618.844547] env[59518]: DEBUG nova.network.neutron [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: eead3c41-1a63-48f7-941e-24470658ed13] allocate_for_instance() {{(pid=59518) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 618.857625] env[59518]: DEBUG nova.compute.manager [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: eead3c41-1a63-48f7-941e-24470658ed13] Start building block device mappings for instance. {{(pid=59518) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 618.946172] env[59518]: DEBUG nova.compute.manager [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: eead3c41-1a63-48f7-941e-24470658ed13] Start spawning the instance on the hypervisor. {{(pid=59518) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 618.971886] env[59518]: DEBUG nova.virt.hardware [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-05-28T13:15:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-05-28T13:15:21Z,direct_url=,disk_format='vmdk',id=e70539a9-144d-4900-807e-914ae0cc8539,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='33bab75ff2cf45ecb4ab54af3adf83ee',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-05-28T13:15:22Z,virtual_size=,visibility=), allow threads: False {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 618.972176] env[59518]: DEBUG nova.virt.hardware [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Flavor limits 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 618.972333] env[59518]: DEBUG nova.virt.hardware [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Image limits 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 618.972509] env[59518]: DEBUG nova.virt.hardware [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Flavor pref 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 618.972643] env[59518]: DEBUG nova.virt.hardware [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Image pref 0:0:0 {{(pid=59518) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:392}} [ 618.972967] env[59518]: DEBUG nova.virt.hardware [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 618.972967] env[59518]: DEBUG nova.virt.hardware [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 618.973119] env[59518]: DEBUG nova.virt.hardware [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59518) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 618.973271] env[59518]: DEBUG nova.virt.hardware [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Got 1 possible topologies {{(pid=59518) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 618.973417] env[59518]: DEBUG nova.virt.hardware [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 618.973574] env[59518]: DEBUG nova.virt.hardware [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 618.974659] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-927e942a-5d94-475c-a7ba-1a97e280e98e {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 618.993385] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6d41b4c5-2913-4769-bb9b-09e70c21ce6a {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.059966] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9383e548-d58b-41c3-b4e6-818812fa32cc {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.073103] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bb881181-3c63-47cd-9495-7d04f69d6658 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.103233] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8057ef88-b995-4963-9847-31af090605bf {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.110794] 
env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9fac7ca8-61c3-4a87-b9fe-a610e0304315 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.123813] env[59518]: DEBUG nova.compute.provider_tree [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Inventory has not changed in ProviderTree for provider: ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd {{(pid=59518) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 619.133758] env[59518]: DEBUG nova.scheduler.client.report [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Inventory has not changed for provider ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 174, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59518) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 619.151712] env[59518]: DEBUG oslo_concurrency.lockutils [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.348s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 619.152311] env[59518]: DEBUG nova.compute.manager [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Start building networks asynchronously for instance. {{(pid=59518) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 619.189835] env[59518]: DEBUG nova.compute.utils [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Using /dev/sd instead of None {{(pid=59518) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 619.190067] env[59518]: DEBUG nova.compute.manager [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Allocating IP information in the background. 
{{(pid=59518) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 619.190234] env[59518]: DEBUG nova.network.neutron [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] allocate_for_instance() {{(pid=59518) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 619.198940] env[59518]: DEBUG nova.compute.manager [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Start building block device mappings for instance. {{(pid=59518) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 619.272207] env[59518]: DEBUG nova.compute.manager [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Start spawning the instance on the hypervisor. {{(pid=59518) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 619.296424] env[59518]: DEBUG nova.virt.hardware [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-05-28T13:15:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-05-28T13:15:21Z,direct_url=,disk_format='vmdk',id=e70539a9-144d-4900-807e-914ae0cc8539,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='33bab75ff2cf45ecb4ab54af3adf83ee',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-05-28T13:15:22Z,virtual_size=,visibility=), allow threads: False {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 619.297259] env[59518]: DEBUG nova.virt.hardware [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Flavor limits 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 619.297461] env[59518]: DEBUG nova.virt.hardware [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Image limits 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 619.297705] env[59518]: DEBUG nova.virt.hardware [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Flavor pref 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 619.297867] env[59518]: DEBUG nova.virt.hardware [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Image pref 0:0:0 {{(pid=59518) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:392}} [ 619.298042] env[59518]: DEBUG nova.virt.hardware [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 619.298510] env[59518]: DEBUG nova.virt.hardware [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 619.298711] env[59518]: DEBUG nova.virt.hardware [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59518) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 619.298913] env[59518]: DEBUG nova.virt.hardware [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Got 1 possible topologies {{(pid=59518) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 619.299201] env[59518]: DEBUG nova.virt.hardware [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 619.299416] env[59518]: DEBUG nova.virt.hardware [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 619.300327] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-105ee324-8a7b-414e-af76-8b73a5fcca8b {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.310743] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-98457006-aef6-411a-b91e-2f5216c93467 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.350629] env[59518]: DEBUG nova.policy [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7b517c3c7a4349a5b8ec718f4d88ec18', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b463f53c58ef49fa918299d1ea0f0a87', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59518) authorize /opt/stack/nova/nova/policy.py:203}} [ 619.403832] env[59518]: DEBUG 
nova.network.neutron [req-0e7e8e97-9e89-484c-99dd-0f369dfc32aa req-8dbc835b-de04-4af4-9c72-f8695e980b8f service nova] [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] Updated VIF entry in instance network info cache for port e2be4581-77a6-4a18-8394-62cc4710988c. {{(pid=59518) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 619.404265] env[59518]: DEBUG nova.network.neutron [req-0e7e8e97-9e89-484c-99dd-0f369dfc32aa req-8dbc835b-de04-4af4-9c72-f8695e980b8f service nova] [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] Updating instance_info_cache with network_info: [{"id": "e2be4581-77a6-4a18-8394-62cc4710988c", "address": "fa:16:3e:f6:55:c8", "network": {"id": "d9098ce0-cf4b-4e0f-b9b7-b613f8d5b1d5", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.131", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "33bab75ff2cf45ecb4ab54af3adf83ee", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "eaf3dfa2-fa01-4d4d-8ecd-a9bc74d90ec2", "external-id": "nsx-vlan-transportzone-546", "segmentation_id": 546, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape2be4581-77", "ovs_interfaceid": "e2be4581-77a6-4a18-8394-62cc4710988c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 619.413781] env[59518]: DEBUG oslo_concurrency.lockutils [req-0e7e8e97-9e89-484c-99dd-0f369dfc32aa req-8dbc835b-de04-4af4-9c72-f8695e980b8f service nova] Releasing lock "refresh_cache-37d863b9-bfcb-4d1f-b99b-832276bd640f" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 619.508950] env[59518]: DEBUG nova.network.neutron [req-0a8fe68f-fc1e-468e-8cf8-447afaf45f03 req-42891b65-4ffc-4b00-ac58-7817e2cbefc5 service nova] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] Updated VIF entry in instance network info cache for port 40b02e48-3fdf-4f44-9c50-03f4844f6ce3. 
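{{(pid=59518) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}}

The instance_info_cache payloads above are Nova network_info models serialized as JSON: a list of VIFs, each carrying a nested network/subnets/ips structure. A small sketch of walking that structure to pull out the fixed IPs; the field names come from the log entries above, but the helper itself is illustrative, not Nova code:

```python
# Walks a network_info payload shaped like the instance_info_cache
# entries in the log and collects (port id, MAC, fixed IPs). Field names
# are taken from the entries above; the helper is illustrative only.
network_info = [{
    "id": "e2be4581-77a6-4a18-8394-62cc4710988c",
    "address": "fa:16:3e:f6:55:c8",
    "network": {"subnets": [{"cidr": "192.168.233.0/24",
                             "ips": [{"address": "192.168.233.131",
                                      "type": "fixed"}]}]},
}]


def fixed_ips(vifs):
    out = []
    for vif in vifs:
        ips = [ip["address"]
               for subnet in vif["network"]["subnets"]
               for ip in subnet["ips"] if ip["type"] == "fixed"]
        out.append((vif["id"], vif["address"], ips))
    return out


print(fixed_ips(network_info))
```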
[ 619.509350] env[59518]: DEBUG nova.network.neutron [req-0a8fe68f-fc1e-468e-8cf8-447afaf45f03 req-42891b65-4ffc-4b00-ac58-7817e2cbefc5 service nova] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] Updating instance_info_cache with network_info: [{"id": "40b02e48-3fdf-4f44-9c50-03f4844f6ce3", "address": "fa:16:3e:70:58:ff", "network": {"id": "d9098ce0-cf4b-4e0f-b9b7-b613f8d5b1d5", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.195", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "33bab75ff2cf45ecb4ab54af3adf83ee", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "eaf3dfa2-fa01-4d4d-8ecd-a9bc74d90ec2", "external-id": "nsx-vlan-transportzone-546", "segmentation_id": 546, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap40b02e48-3f", "ovs_interfaceid": "40b02e48-3fdf-4f44-9c50-03f4844f6ce3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 619.518504] env[59518]: DEBUG oslo_concurrency.lockutils [req-0a8fe68f-fc1e-468e-8cf8-447afaf45f03 req-42891b65-4ffc-4b00-ac58-7817e2cbefc5 service nova] Releasing lock "refresh_cache-a894a8af-52b8-4b1c-a5ea-2469f06ea17a" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 619.531888] env[59518]: DEBUG nova.policy [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7b517c3c7a4349a5b8ec718f4d88ec18', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b463f53c58ef49fa918299d1ea0f0a87', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59518) authorize /opt/stack/nova/nova/policy.py:203}} [ 620.376895] env[59518]: DEBUG oslo_concurrency.lockutils [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Acquiring lock "3e59b5d7-978d-405a-b68a-47ee03b9a713" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 620.377464] env[59518]: DEBUG oslo_concurrency.lockutils [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Lock "3e59b5d7-978d-405a-b68a-47ee03b9a713" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59518) inner
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 620.873606] env[59518]: DEBUG nova.network.neutron [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Successfully created port: 47153a07-1e57-43be-abf7-6351e2ac60fd {{(pid=59518) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 620.955462] env[59518]: DEBUG oslo_concurrency.lockutils [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Acquiring lock "af4b8dd9-a05d-427e-a147-76c7cfec5862" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 620.955585] env[59518]: DEBUG oslo_concurrency.lockutils [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Lock "af4b8dd9-a05d-427e-a147-76c7cfec5862" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 621.178306] env[59518]: DEBUG nova.network.neutron [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: eead3c41-1a63-48f7-941e-24470658ed13] Successfully created port: 3e3039cc-afcc-476b-b77a-866edceab24a {{(pid=59518) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 621.621196] env[59518]: DEBUG nova.network.neutron [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Successfully created port: 10aa1ce6-0f4a-4338-96e1-d655b926cd94 {{(pid=59518) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 621.970178] env[59518]: DEBUG nova.network.neutron [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: b4fb287e-7329-4002-911a-2d1eee138372] Successfully updated port: 85370e19-d1df-4ff6-b1fb-5624ef63bd04 {{(pid=59518) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 621.995464] env[59518]: DEBUG oslo_concurrency.lockutils [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Acquiring lock "refresh_cache-b4fb287e-7329-4002-911a-2d1eee138372" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 621.995464] env[59518]: DEBUG oslo_concurrency.lockutils [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Acquired lock "refresh_cache-b4fb287e-7329-4002-911a-2d1eee138372" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 621.995638] env[59518]: DEBUG nova.network.neutron [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: 
b4fb287e-7329-4002-911a-2d1eee138372] Building network info cache for instance {{(pid=59518) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 622.103315] env[59518]: DEBUG oslo_concurrency.lockutils [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Acquiring lock "ae88d565-bbf5-4c29-aee9-364c23086de5" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 622.103467] env[59518]: DEBUG oslo_concurrency.lockutils [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Lock "ae88d565-bbf5-4c29-aee9-364c23086de5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 622.365040] env[59518]: DEBUG nova.network.neutron [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: b4fb287e-7329-4002-911a-2d1eee138372] Instance cache missing network info. {{(pid=59518) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 623.245367] env[59518]: DEBUG oslo_concurrency.lockutils [None req-116c1cff-3a7b-4c00-a467-567411db418e tempest-ServerRescueNegativeTestJSON-1916113752 tempest-ServerRescueNegativeTestJSON-1916113752-project-member] Acquiring lock "ec34b663-788a-4d55-aca8-3e139b374f71" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 623.245367] env[59518]: DEBUG oslo_concurrency.lockutils [None req-116c1cff-3a7b-4c00-a467-567411db418e tempest-ServerRescueNegativeTestJSON-1916113752 tempest-ServerRescueNegativeTestJSON-1916113752-project-member] Lock "ec34b663-788a-4d55-aca8-3e139b374f71" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 623.380256] env[59518]: DEBUG nova.network.neutron [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: b4fb287e-7329-4002-911a-2d1eee138372] Updating instance_info_cache with network_info: [{"id": "85370e19-d1df-4ff6-b1fb-5624ef63bd04", "address": "fa:16:3e:28:08:d5", "network": {"id": "2a11bd53-b61c-45e8-bbcb-745d685ad1b2", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1418418357-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "de52be7e32e5496f8ee12e4750b3644d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": 
"ae4e3171-21cd-4094-b6cf-81bf366c75bd", "external-id": "nsx-vlan-transportzone-193", "segmentation_id": 193, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap85370e19-d1", "ovs_interfaceid": "85370e19-d1df-4ff6-b1fb-5624ef63bd04", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 623.391520] env[59518]: DEBUG oslo_concurrency.lockutils [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Releasing lock "refresh_cache-b4fb287e-7329-4002-911a-2d1eee138372" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 623.391827] env[59518]: DEBUG nova.compute.manager [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: b4fb287e-7329-4002-911a-2d1eee138372] Instance network_info: |[{"id": "85370e19-d1df-4ff6-b1fb-5624ef63bd04", "address": "fa:16:3e:28:08:d5", "network": {"id": "2a11bd53-b61c-45e8-bbcb-745d685ad1b2", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1418418357-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "de52be7e32e5496f8ee12e4750b3644d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ae4e3171-21cd-4094-b6cf-81bf366c75bd", "external-id": "nsx-vlan-transportzone-193", "segmentation_id": 193, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap85370e19-d1", "ovs_interfaceid": "85370e19-d1df-4ff6-b1fb-5624ef63bd04", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59518) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 623.392277] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: b4fb287e-7329-4002-911a-2d1eee138372] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:28:08:d5', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ae4e3171-21cd-4094-b6cf-81bf366c75bd', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '85370e19-d1df-4ff6-b1fb-5624ef63bd04', 'vif_model': 'vmxnet3'}] {{(pid=59518) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 623.399677] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Creating folder: Project (de52be7e32e5496f8ee12e4750b3644d). Parent ref: group-v88807. 
{{(pid=59518) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 623.400668] env[59518]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-6432f963-6231-4884-89a3-4a97f710f5eb {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 623.413292] env[59518]: INFO nova.virt.vmwareapi.vm_util [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Created folder: Project (de52be7e32e5496f8ee12e4750b3644d) in parent group-v88807. [ 623.413292] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Creating folder: Instances. Parent ref: group-v88826. {{(pid=59518) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 623.413292] env[59518]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-0d803aee-cb1a-4845-91e3-5882a58391b1 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 623.421207] env[59518]: INFO nova.virt.vmwareapi.vm_util [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Created folder: Instances in parent group-v88826. [ 623.421434] env[59518]: DEBUG oslo.service.loopingcall [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59518) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 623.421614] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b4fb287e-7329-4002-911a-2d1eee138372] Creating VM on the ESX host {{(pid=59518) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 623.421804] env[59518]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-2d720de7-6744-4f91-b12c-4494dd2d3b27 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 623.442613] env[59518]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 623.442613] env[59518]: value = "task-307923" [ 623.442613] env[59518]: _type = "Task" [ 623.442613] env[59518]: } to complete. 
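{{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}

The CreateVM_Task entries around this point show the standard vCenter task pattern: the SOAP call returns a task handle immediately, and the client polls its progress (0% -> 25% -> completed) until it finishes. A generic sketch of that polling loop; poll() here is a hypothetical stand-in for a vCenter progress query (what oslo.vmware's _poll_task does internally), purely for illustration:

```python
# Generic sketch of the task-polling loop behind the "Task: {'id':
# 'task-307923', ...} progress is N%" lines. poll() is a hypothetical
# stand-in for a vCenter progress query; it is illustrative only.
import itertools
import time


def wait_for_task(task_id, poll, interval=0.5):
    while True:
        state, progress = poll(task_id)
        print(f"Task {task_id}: {state}, progress is {progress}%")
        if state == "success":
            return
        if state == "error":
            raise RuntimeError(f"{task_id} failed")
        time.sleep(interval)  # the log shows roughly 0.5s between polls


# Fake progress sequence mimicking task-307923's 0% -> 25% -> done run.
_states = itertools.chain([("running", 0), ("running", 25), ("running", 25)],
                          itertools.repeat(("success", 100)))
wait_for_task("task-307923", lambda _tid: next(_states), interval=0)
```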
[ 623.447903] env[59518]: DEBUG nova.compute.manager [req-fd10a255-7ab9-4818-9bfe-e5dfb53c9620 req-aba28a74-86b4-4faf-9ec9-0a42c1f721f7 service nova] [instance: b4fb287e-7329-4002-911a-2d1eee138372] Received event network-vif-plugged-85370e19-d1df-4ff6-b1fb-5624ef63bd04 {{(pid=59518) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 623.448120] env[59518]: DEBUG oslo_concurrency.lockutils [req-fd10a255-7ab9-4818-9bfe-e5dfb53c9620 req-aba28a74-86b4-4faf-9ec9-0a42c1f721f7 service nova] Acquiring lock "b4fb287e-7329-4002-911a-2d1eee138372-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 623.448317] env[59518]: DEBUG oslo_concurrency.lockutils [req-fd10a255-7ab9-4818-9bfe-e5dfb53c9620 req-aba28a74-86b4-4faf-9ec9-0a42c1f721f7 service nova] Lock "b4fb287e-7329-4002-911a-2d1eee138372-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 623.448469] env[59518]: DEBUG oslo_concurrency.lockutils [req-fd10a255-7ab9-4818-9bfe-e5dfb53c9620 req-aba28a74-86b4-4faf-9ec9-0a42c1f721f7 service nova] Lock "b4fb287e-7329-4002-911a-2d1eee138372-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 623.448632] env[59518]: DEBUG nova.compute.manager [req-fd10a255-7ab9-4818-9bfe-e5dfb53c9620 req-aba28a74-86b4-4faf-9ec9-0a42c1f721f7 service nova] [instance: b4fb287e-7329-4002-911a-2d1eee138372] No waiting events found dispatching network-vif-plugged-85370e19-d1df-4ff6-b1fb-5624ef63bd04 {{(pid=59518) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 623.448769] env[59518]: WARNING nova.compute.manager [req-fd10a255-7ab9-4818-9bfe-e5dfb53c9620 req-aba28a74-86b4-4faf-9ec9-0a42c1f721f7 service nova] [instance: b4fb287e-7329-4002-911a-2d1eee138372] Received unexpected event network-vif-plugged-85370e19-d1df-4ff6-b1fb-5624ef63bd04 for instance with vm_state building and task_state spawning. [ 623.452799] env[59518]: DEBUG oslo_vmware.api [-] Task: {'id': task-307923, 'name': CreateVM_Task} progress is 0%. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 623.955791] env[59518]: DEBUG oslo_vmware.api [-] Task: {'id': task-307923, 'name': CreateVM_Task} progress is 25%. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 624.453621] env[59518]: DEBUG oslo_vmware.api [-] Task: {'id': task-307923, 'name': CreateVM_Task} progress is 25%.
{{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 624.472632] env[59518]: DEBUG nova.network.neutron [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: eead3c41-1a63-48f7-941e-24470658ed13] Successfully updated port: 3e3039cc-afcc-476b-b77a-866edceab24a {{(pid=59518) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 624.487015] env[59518]: DEBUG oslo_concurrency.lockutils [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Acquiring lock "refresh_cache-eead3c41-1a63-48f7-941e-24470658ed13" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 624.487148] env[59518]: DEBUG oslo_concurrency.lockutils [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Acquired lock "refresh_cache-eead3c41-1a63-48f7-941e-24470658ed13" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 624.487288] env[59518]: DEBUG nova.network.neutron [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: eead3c41-1a63-48f7-941e-24470658ed13] Building network info cache for instance {{(pid=59518) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 624.537429] env[59518]: DEBUG nova.network.neutron [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: eead3c41-1a63-48f7-941e-24470658ed13] Instance cache missing network info. 
{{(pid=59518) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 624.801919] env[59518]: DEBUG nova.network.neutron [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Successfully updated port: 47153a07-1e57-43be-abf7-6351e2ac60fd {{(pid=59518) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 624.810043] env[59518]: DEBUG oslo_concurrency.lockutils [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Acquiring lock "refresh_cache-a85abea1-8e8d-4007-803d-e36fff55e587" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 624.810222] env[59518]: DEBUG oslo_concurrency.lockutils [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Acquired lock "refresh_cache-a85abea1-8e8d-4007-803d-e36fff55e587" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 624.810374] env[59518]: DEBUG nova.network.neutron [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Building network info cache for instance {{(pid=59518) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 624.959277] env[59518]: DEBUG oslo_vmware.api [-] Task: {'id': task-307923, 'name': CreateVM_Task, 'duration_secs': 1.338824} completed successfully. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 624.959277] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b4fb287e-7329-4002-911a-2d1eee138372] Created VM on the ESX host {{(pid=59518) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 624.959277] env[59518]: DEBUG oslo_concurrency.lockutils [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 624.959277] env[59518]: DEBUG oslo_concurrency.lockutils [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 624.959277] env[59518]: DEBUG oslo_concurrency.lockutils [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 624.960152] env[59518]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-60aa668f-bb8f-47af-93db-ea104d8aef8a {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 624.966534] env[59518]: DEBUG 
oslo_vmware.api [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Waiting for the task: (returnval){ [ 624.966534] env[59518]: value = "session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]524f329d-e522-f6f0-3b45-326eaa11d9e4" [ 624.966534] env[59518]: _type = "Task" [ 624.966534] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 624.975872] env[59518]: DEBUG oslo_vmware.api [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Task: {'id': session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]524f329d-e522-f6f0-3b45-326eaa11d9e4, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 625.102759] env[59518]: DEBUG nova.network.neutron [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Instance cache missing network info. {{(pid=59518) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 625.171179] env[59518]: DEBUG nova.network.neutron [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Successfully updated port: 10aa1ce6-0f4a-4338-96e1-d655b926cd94 {{(pid=59518) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 625.181003] env[59518]: DEBUG oslo_concurrency.lockutils [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Acquiring lock "refresh_cache-04a58b0b-dfd8-4227-9c10-a69225fa5a53" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 625.181171] env[59518]: DEBUG oslo_concurrency.lockutils [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Acquired lock "refresh_cache-04a58b0b-dfd8-4227-9c10-a69225fa5a53" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 625.181328] env[59518]: DEBUG nova.network.neutron [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Building network info cache for instance {{(pid=59518) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 625.239492] env[59518]: DEBUG nova.network.neutron [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: eead3c41-1a63-48f7-941e-24470658ed13] Updating instance_info_cache with network_info: [{"id": "3e3039cc-afcc-476b-b77a-866edceab24a", "address": "fa:16:3e:6c:ca:c7", "network": {"id": "79420454-ca4b-4308-9c1c-7f52cbb6ceda", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1264907137-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b463f53c58ef49fa918299d1ea0f0a87", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ad4fcde7-8926-402a-a9b7-4878d2bc1cf6", "external-id": "nsx-vlan-transportzone-840", "segmentation_id": 840, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3e3039cc-af", "ovs_interfaceid": "3e3039cc-afcc-476b-b77a-866edceab24a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 625.252921] env[59518]: DEBUG oslo_concurrency.lockutils [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Releasing lock "refresh_cache-eead3c41-1a63-48f7-941e-24470658ed13" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 625.253258] env[59518]: DEBUG nova.compute.manager [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: eead3c41-1a63-48f7-941e-24470658ed13] Instance network_info: |[{"id": "3e3039cc-afcc-476b-b77a-866edceab24a", "address": "fa:16:3e:6c:ca:c7", "network": {"id": "79420454-ca4b-4308-9c1c-7f52cbb6ceda", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1264907137-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b463f53c58ef49fa918299d1ea0f0a87", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ad4fcde7-8926-402a-a9b7-4878d2bc1cf6", "external-id": "nsx-vlan-transportzone-840", "segmentation_id": 840, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3e3039cc-af", "ovs_interfaceid": "3e3039cc-afcc-476b-b77a-866edceab24a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59518) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 625.253628] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: eead3c41-1a63-48f7-941e-24470658ed13] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:6c:ca:c7', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ad4fcde7-8926-402a-a9b7-4878d2bc1cf6', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '3e3039cc-afcc-476b-b77a-866edceab24a', 'vif_model': 'vmxnet3'}] {{(pid=59518) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 625.269598] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [None 
req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Creating folder: Project (b463f53c58ef49fa918299d1ea0f0a87). Parent ref: group-v88807. {{(pid=59518) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 625.270157] env[59518]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-ef9ab5b9-ac55-47ca-90ca-b210b6056b07 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 625.281044] env[59518]: INFO nova.virt.vmwareapi.vm_util [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Created folder: Project (b463f53c58ef49fa918299d1ea0f0a87) in parent group-v88807. [ 625.281295] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Creating folder: Instances. Parent ref: group-v88829. {{(pid=59518) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 625.282151] env[59518]: DEBUG nova.network.neutron [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Instance cache missing network info. {{(pid=59518) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 625.283971] env[59518]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-ed7adb35-fe84-4049-b435-bd88fe4feb51 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 625.312948] env[59518]: INFO nova.virt.vmwareapi.vm_util [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Created folder: Instances in parent group-v88829. [ 625.313040] env[59518]: DEBUG oslo.service.loopingcall [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59518) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 625.313239] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: eead3c41-1a63-48f7-941e-24470658ed13] Creating VM on the ESX host {{(pid=59518) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 625.313446] env[59518]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-67abdb03-dcd0-4667-9200-806966415807 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 625.340079] env[59518]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 625.340079] env[59518]: value = "task-307926" [ 625.340079] env[59518]: _type = "Task" [ 625.340079] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 625.353995] env[59518]: DEBUG oslo_vmware.api [-] Task: {'id': task-307926, 'name': CreateVM_Task} progress is 6%. 
{{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 625.479645] env[59518]: DEBUG oslo_concurrency.lockutils [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 625.479645] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: b4fb287e-7329-4002-911a-2d1eee138372] Processing image e70539a9-144d-4900-807e-914ae0cc8539 {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 625.479645] env[59518]: DEBUG oslo_concurrency.lockutils [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 625.501313] env[59518]: DEBUG nova.network.neutron [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Updating instance_info_cache with network_info: [{"id": "47153a07-1e57-43be-abf7-6351e2ac60fd", "address": "fa:16:3e:33:1d:d0", "network": {"id": "79420454-ca4b-4308-9c1c-7f52cbb6ceda", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1264907137-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b463f53c58ef49fa918299d1ea0f0a87", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ad4fcde7-8926-402a-a9b7-4878d2bc1cf6", "external-id": "nsx-vlan-transportzone-840", "segmentation_id": 840, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap47153a07-1e", "ovs_interfaceid": "47153a07-1e57-43be-abf7-6351e2ac60fd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 625.517154] env[59518]: DEBUG oslo_concurrency.lockutils [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Releasing lock "refresh_cache-a85abea1-8e8d-4007-803d-e36fff55e587" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 625.517154] env[59518]: DEBUG nova.compute.manager [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Instance network_info: 
|[{"id": "47153a07-1e57-43be-abf7-6351e2ac60fd", "address": "fa:16:3e:33:1d:d0", "network": {"id": "79420454-ca4b-4308-9c1c-7f52cbb6ceda", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1264907137-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b463f53c58ef49fa918299d1ea0f0a87", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ad4fcde7-8926-402a-a9b7-4878d2bc1cf6", "external-id": "nsx-vlan-transportzone-840", "segmentation_id": 840, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap47153a07-1e", "ovs_interfaceid": "47153a07-1e57-43be-abf7-6351e2ac60fd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59518) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 625.517280] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:33:1d:d0', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ad4fcde7-8926-402a-a9b7-4878d2bc1cf6', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '47153a07-1e57-43be-abf7-6351e2ac60fd', 'vif_model': 'vmxnet3'}] {{(pid=59518) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 625.523816] env[59518]: DEBUG oslo.service.loopingcall [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59518) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 625.524462] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Creating VM on the ESX host {{(pid=59518) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 625.524749] env[59518]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-37936182-f56a-420a-b1af-77102733a322 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 625.559328] env[59518]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 625.559328] env[59518]: value = "task-307927" [ 625.559328] env[59518]: _type = "Task" [ 625.559328] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 625.571542] env[59518]: DEBUG oslo_vmware.api [-] Task: {'id': task-307927, 'name': CreateVM_Task} progress is 0%. 
{{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 625.612560] env[59518]: DEBUG nova.network.neutron [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Updating instance_info_cache with network_info: [{"id": "10aa1ce6-0f4a-4338-96e1-d655b926cd94", "address": "fa:16:3e:5f:0a:84", "network": {"id": "79420454-ca4b-4308-9c1c-7f52cbb6ceda", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1264907137-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b463f53c58ef49fa918299d1ea0f0a87", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ad4fcde7-8926-402a-a9b7-4878d2bc1cf6", "external-id": "nsx-vlan-transportzone-840", "segmentation_id": 840, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap10aa1ce6-0f", "ovs_interfaceid": "10aa1ce6-0f4a-4338-96e1-d655b926cd94", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 625.633235] env[59518]: DEBUG oslo_concurrency.lockutils [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Releasing lock "refresh_cache-04a58b0b-dfd8-4227-9c10-a69225fa5a53" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 625.633954] env[59518]: DEBUG nova.compute.manager [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Instance network_info: |[{"id": "10aa1ce6-0f4a-4338-96e1-d655b926cd94", "address": "fa:16:3e:5f:0a:84", "network": {"id": "79420454-ca4b-4308-9c1c-7f52cbb6ceda", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1264907137-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b463f53c58ef49fa918299d1ea0f0a87", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ad4fcde7-8926-402a-a9b7-4878d2bc1cf6", "external-id": "nsx-vlan-transportzone-840", "segmentation_id": 840, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap10aa1ce6-0f", "ovs_interfaceid": "10aa1ce6-0f4a-4338-96e1-d655b926cd94", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59518) 
_allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 625.635104] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:5f:0a:84', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ad4fcde7-8926-402a-a9b7-4878d2bc1cf6', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '10aa1ce6-0f4a-4338-96e1-d655b926cd94', 'vif_model': 'vmxnet3'}] {{(pid=59518) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 625.653717] env[59518]: DEBUG oslo.service.loopingcall [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59518) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 625.653973] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Creating VM on the ESX host {{(pid=59518) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 625.654194] env[59518]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-040cea00-8cea-4e24-92fe-7fc26e4a4238 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 625.677364] env[59518]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 625.677364] env[59518]: value = "task-307928" [ 625.677364] env[59518]: _type = "Task" [ 625.677364] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 625.686755] env[59518]: DEBUG oslo_vmware.api [-] Task: {'id': task-307928, 'name': CreateVM_Task} progress is 0%. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 625.849082] env[59518]: DEBUG oslo_vmware.api [-] Task: {'id': task-307926, 'name': CreateVM_Task, 'duration_secs': 0.313339} completed successfully. 
{{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 625.849270] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: eead3c41-1a63-48f7-941e-24470658ed13] Created VM on the ESX host {{(pid=59518) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 625.850022] env[59518]: DEBUG oslo_concurrency.lockutils [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 625.850220] env[59518]: DEBUG oslo_concurrency.lockutils [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 625.850661] env[59518]: DEBUG oslo_concurrency.lockutils [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 625.851110] env[59518]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ed450e61-93be-4844-870d-aa4e2f95ae07 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 625.855750] env[59518]: DEBUG oslo_vmware.api [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Waiting for the task: (returnval){ [ 625.855750] env[59518]: value = "session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]52049217-0a0c-2599-32db-8b89c95ada4d" [ 625.855750] env[59518]: _type = "Task" [ 625.855750] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 625.865449] env[59518]: DEBUG oslo_vmware.api [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Task: {'id': session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]52049217-0a0c-2599-32db-8b89c95ada4d, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 626.070500] env[59518]: DEBUG oslo_vmware.api [-] Task: {'id': task-307927, 'name': CreateVM_Task, 'duration_secs': 0.29675} completed successfully. 
{{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 626.070500] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Created VM on the ESX host {{(pid=59518) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 626.071025] env[59518]: DEBUG oslo_concurrency.lockutils [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 626.188411] env[59518]: DEBUG oslo_vmware.api [-] Task: {'id': task-307928, 'name': CreateVM_Task, 'duration_secs': 0.295523} completed successfully. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 626.188411] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Created VM on the ESX host {{(pid=59518) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 626.190149] env[59518]: DEBUG oslo_concurrency.lockutils [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 626.366208] env[59518]: DEBUG oslo_concurrency.lockutils [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 626.366552] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: eead3c41-1a63-48f7-941e-24470658ed13] Processing image e70539a9-144d-4900-807e-914ae0cc8539 {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 626.366845] env[59518]: DEBUG oslo_concurrency.lockutils [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 626.367149] env[59518]: DEBUG oslo_concurrency.lockutils [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 626.367497] env[59518]: DEBUG oslo_concurrency.lockutils [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" 
{{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 626.367793] env[59518]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-eb03496f-efbf-4bf2-b9a9-9e1fd5718650 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 626.372642] env[59518]: DEBUG oslo_vmware.api [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Waiting for the task: (returnval){ [ 626.372642] env[59518]: value = "session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]52796c28-99c1-6298-1127-c2984afd0ec8" [ 626.372642] env[59518]: _type = "Task" [ 626.372642] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 626.383577] env[59518]: DEBUG oslo_vmware.api [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Task: {'id': session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]52796c28-99c1-6298-1127-c2984afd0ec8, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 626.863448] env[59518]: DEBUG nova.compute.manager [req-dbdf07ba-3edb-4a2f-9008-0d325add0956 req-60f70123-e808-4fe8-ae28-0e05282c29f9 service nova] [instance: b4fb287e-7329-4002-911a-2d1eee138372] Received event network-changed-85370e19-d1df-4ff6-b1fb-5624ef63bd04 {{(pid=59518) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 626.863696] env[59518]: DEBUG nova.compute.manager [req-dbdf07ba-3edb-4a2f-9008-0d325add0956 req-60f70123-e808-4fe8-ae28-0e05282c29f9 service nova] [instance: b4fb287e-7329-4002-911a-2d1eee138372] Refreshing instance network info cache due to event network-changed-85370e19-d1df-4ff6-b1fb-5624ef63bd04. 
{{(pid=59518) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 626.863887] env[59518]: DEBUG oslo_concurrency.lockutils [req-dbdf07ba-3edb-4a2f-9008-0d325add0956 req-60f70123-e808-4fe8-ae28-0e05282c29f9 service nova] Acquiring lock "refresh_cache-b4fb287e-7329-4002-911a-2d1eee138372" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 626.864043] env[59518]: DEBUG oslo_concurrency.lockutils [req-dbdf07ba-3edb-4a2f-9008-0d325add0956 req-60f70123-e808-4fe8-ae28-0e05282c29f9 service nova] Acquired lock "refresh_cache-b4fb287e-7329-4002-911a-2d1eee138372" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 626.864805] env[59518]: DEBUG nova.network.neutron [req-dbdf07ba-3edb-4a2f-9008-0d325add0956 req-60f70123-e808-4fe8-ae28-0e05282c29f9 service nova] [instance: b4fb287e-7329-4002-911a-2d1eee138372] Refreshing network info cache for port 85370e19-d1df-4ff6-b1fb-5624ef63bd04 {{(pid=59518) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 626.883442] env[59518]: DEBUG oslo_concurrency.lockutils [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 626.883703] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Processing image e70539a9-144d-4900-807e-914ae0cc8539 {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 626.883999] env[59518]: DEBUG oslo_concurrency.lockutils [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 626.886772] env[59518]: DEBUG oslo_concurrency.lockutils [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 626.887347] env[59518]: DEBUG oslo_concurrency.lockutils [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 626.887980] env[59518]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6bfa9cf6-e49c-4ef1-9680-a96d7afc47a5 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 626.894710] env[59518]: DEBUG oslo_vmware.api [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 
tempest-ListServersNegativeTestJSON-686373666-project-member] Waiting for the task: (returnval){ [ 626.894710] env[59518]: value = "session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]52f7a217-4995-5754-4289-e0e9396007bf" [ 626.894710] env[59518]: _type = "Task" [ 626.894710] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 626.906003] env[59518]: DEBUG oslo_vmware.api [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Task: {'id': session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]52f7a217-4995-5754-4289-e0e9396007bf, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 627.162797] env[59518]: DEBUG nova.network.neutron [req-dbdf07ba-3edb-4a2f-9008-0d325add0956 req-60f70123-e808-4fe8-ae28-0e05282c29f9 service nova] [instance: b4fb287e-7329-4002-911a-2d1eee138372] Updated VIF entry in instance network info cache for port 85370e19-d1df-4ff6-b1fb-5624ef63bd04. {{(pid=59518) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 627.163127] env[59518]: DEBUG nova.network.neutron [req-dbdf07ba-3edb-4a2f-9008-0d325add0956 req-60f70123-e808-4fe8-ae28-0e05282c29f9 service nova] [instance: b4fb287e-7329-4002-911a-2d1eee138372] Updating instance_info_cache with network_info: [{"id": "85370e19-d1df-4ff6-b1fb-5624ef63bd04", "address": "fa:16:3e:28:08:d5", "network": {"id": "2a11bd53-b61c-45e8-bbcb-745d685ad1b2", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1418418357-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "de52be7e32e5496f8ee12e4750b3644d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ae4e3171-21cd-4094-b6cf-81bf366c75bd", "external-id": "nsx-vlan-transportzone-193", "segmentation_id": 193, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap85370e19-d1", "ovs_interfaceid": "85370e19-d1df-4ff6-b1fb-5624ef63bd04", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 627.172942] env[59518]: DEBUG oslo_concurrency.lockutils [req-dbdf07ba-3edb-4a2f-9008-0d325add0956 req-60f70123-e808-4fe8-ae28-0e05282c29f9 service nova] Releasing lock "refresh_cache-b4fb287e-7329-4002-911a-2d1eee138372" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 627.173088] env[59518]: DEBUG nova.compute.manager [req-dbdf07ba-3edb-4a2f-9008-0d325add0956 req-60f70123-e808-4fe8-ae28-0e05282c29f9 service nova] [instance: eead3c41-1a63-48f7-941e-24470658ed13] Received event network-vif-plugged-3e3039cc-afcc-476b-b77a-866edceab24a {{(pid=59518) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 627.174018] env[59518]: DEBUG oslo_concurrency.lockutils [req-dbdf07ba-3edb-4a2f-9008-0d325add0956 req-60f70123-e808-4fe8-ae28-0e05282c29f9 
service nova] Acquiring lock "eead3c41-1a63-48f7-941e-24470658ed13-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 627.174018] env[59518]: DEBUG oslo_concurrency.lockutils [req-dbdf07ba-3edb-4a2f-9008-0d325add0956 req-60f70123-e808-4fe8-ae28-0e05282c29f9 service nova] Lock "eead3c41-1a63-48f7-941e-24470658ed13-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 627.174018] env[59518]: DEBUG oslo_concurrency.lockutils [req-dbdf07ba-3edb-4a2f-9008-0d325add0956 req-60f70123-e808-4fe8-ae28-0e05282c29f9 service nova] Lock "eead3c41-1a63-48f7-941e-24470658ed13-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 627.174018] env[59518]: DEBUG nova.compute.manager [req-dbdf07ba-3edb-4a2f-9008-0d325add0956 req-60f70123-e808-4fe8-ae28-0e05282c29f9 service nova] [instance: eead3c41-1a63-48f7-941e-24470658ed13] No waiting events found dispatching network-vif-plugged-3e3039cc-afcc-476b-b77a-866edceab24a {{(pid=59518) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 627.174410] env[59518]: WARNING nova.compute.manager [req-dbdf07ba-3edb-4a2f-9008-0d325add0956 req-60f70123-e808-4fe8-ae28-0e05282c29f9 service nova] [instance: eead3c41-1a63-48f7-941e-24470658ed13] Received unexpected event network-vif-plugged-3e3039cc-afcc-476b-b77a-866edceab24a for instance with vm_state building and task_state spawning. [ 627.174410] env[59518]: DEBUG nova.compute.manager [req-dbdf07ba-3edb-4a2f-9008-0d325add0956 req-60f70123-e808-4fe8-ae28-0e05282c29f9 service nova] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Received event network-vif-plugged-47153a07-1e57-43be-abf7-6351e2ac60fd {{(pid=59518) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 627.174410] env[59518]: DEBUG oslo_concurrency.lockutils [req-dbdf07ba-3edb-4a2f-9008-0d325add0956 req-60f70123-e808-4fe8-ae28-0e05282c29f9 service nova] Acquiring lock "a85abea1-8e8d-4007-803d-e36fff55e587-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 627.174410] env[59518]: DEBUG oslo_concurrency.lockutils [req-dbdf07ba-3edb-4a2f-9008-0d325add0956 req-60f70123-e808-4fe8-ae28-0e05282c29f9 service nova] Lock "a85abea1-8e8d-4007-803d-e36fff55e587-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 627.174588] env[59518]: DEBUG oslo_concurrency.lockutils [req-dbdf07ba-3edb-4a2f-9008-0d325add0956 req-60f70123-e808-4fe8-ae28-0e05282c29f9 service nova] Lock "a85abea1-8e8d-4007-803d-e36fff55e587-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 627.174625] env[59518]: DEBUG nova.compute.manager [req-dbdf07ba-3edb-4a2f-9008-0d325add0956 req-60f70123-e808-4fe8-ae28-0e05282c29f9 service nova] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] No waiting events found dispatching 
network-vif-plugged-47153a07-1e57-43be-abf7-6351e2ac60fd {{(pid=59518) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 627.174770] env[59518]: WARNING nova.compute.manager [req-dbdf07ba-3edb-4a2f-9008-0d325add0956 req-60f70123-e808-4fe8-ae28-0e05282c29f9 service nova] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Received unexpected event network-vif-plugged-47153a07-1e57-43be-abf7-6351e2ac60fd for instance with vm_state building and task_state spawning. [ 627.174944] env[59518]: DEBUG nova.compute.manager [req-dbdf07ba-3edb-4a2f-9008-0d325add0956 req-60f70123-e808-4fe8-ae28-0e05282c29f9 service nova] [instance: eead3c41-1a63-48f7-941e-24470658ed13] Received event network-changed-3e3039cc-afcc-476b-b77a-866edceab24a {{(pid=59518) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 627.175111] env[59518]: DEBUG nova.compute.manager [req-dbdf07ba-3edb-4a2f-9008-0d325add0956 req-60f70123-e808-4fe8-ae28-0e05282c29f9 service nova] [instance: eead3c41-1a63-48f7-941e-24470658ed13] Refreshing instance network info cache due to event network-changed-3e3039cc-afcc-476b-b77a-866edceab24a. {{(pid=59518) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 627.175280] env[59518]: DEBUG oslo_concurrency.lockutils [req-dbdf07ba-3edb-4a2f-9008-0d325add0956 req-60f70123-e808-4fe8-ae28-0e05282c29f9 service nova] Acquiring lock "refresh_cache-eead3c41-1a63-48f7-941e-24470658ed13" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 627.175401] env[59518]: DEBUG oslo_concurrency.lockutils [req-dbdf07ba-3edb-4a2f-9008-0d325add0956 req-60f70123-e808-4fe8-ae28-0e05282c29f9 service nova] Acquired lock "refresh_cache-eead3c41-1a63-48f7-941e-24470658ed13" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 627.175544] env[59518]: DEBUG nova.network.neutron [req-dbdf07ba-3edb-4a2f-9008-0d325add0956 req-60f70123-e808-4fe8-ae28-0e05282c29f9 service nova] [instance: eead3c41-1a63-48f7-941e-24470658ed13] Refreshing network info cache for port 3e3039cc-afcc-476b-b77a-866edceab24a {{(pid=59518) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 627.408298] env[59518]: DEBUG oslo_concurrency.lockutils [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 627.408478] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Processing image e70539a9-144d-4900-807e-914ae0cc8539 {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 627.408685] env[59518]: DEBUG oslo_concurrency.lockutils [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 627.572864] env[59518]: DEBUG nova.network.neutron 
[req-dbdf07ba-3edb-4a2f-9008-0d325add0956 req-60f70123-e808-4fe8-ae28-0e05282c29f9 service nova] [instance: eead3c41-1a63-48f7-941e-24470658ed13] Updated VIF entry in instance network info cache for port 3e3039cc-afcc-476b-b77a-866edceab24a. {{(pid=59518) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 627.572864] env[59518]: DEBUG nova.network.neutron [req-dbdf07ba-3edb-4a2f-9008-0d325add0956 req-60f70123-e808-4fe8-ae28-0e05282c29f9 service nova] [instance: eead3c41-1a63-48f7-941e-24470658ed13] Updating instance_info_cache with network_info: [{"id": "3e3039cc-afcc-476b-b77a-866edceab24a", "address": "fa:16:3e:6c:ca:c7", "network": {"id": "79420454-ca4b-4308-9c1c-7f52cbb6ceda", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1264907137-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b463f53c58ef49fa918299d1ea0f0a87", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ad4fcde7-8926-402a-a9b7-4878d2bc1cf6", "external-id": "nsx-vlan-transportzone-840", "segmentation_id": 840, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3e3039cc-af", "ovs_interfaceid": "3e3039cc-afcc-476b-b77a-866edceab24a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 627.588976] env[59518]: DEBUG oslo_concurrency.lockutils [req-dbdf07ba-3edb-4a2f-9008-0d325add0956 req-60f70123-e808-4fe8-ae28-0e05282c29f9 service nova] Releasing lock "refresh_cache-eead3c41-1a63-48f7-941e-24470658ed13" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 627.588976] env[59518]: DEBUG nova.compute.manager [req-dbdf07ba-3edb-4a2f-9008-0d325add0956 req-60f70123-e808-4fe8-ae28-0e05282c29f9 service nova] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Received event network-vif-plugged-10aa1ce6-0f4a-4338-96e1-d655b926cd94 {{(pid=59518) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 627.588976] env[59518]: DEBUG oslo_concurrency.lockutils [req-dbdf07ba-3edb-4a2f-9008-0d325add0956 req-60f70123-e808-4fe8-ae28-0e05282c29f9 service nova] Acquiring lock "04a58b0b-dfd8-4227-9c10-a69225fa5a53-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 627.588976] env[59518]: DEBUG oslo_concurrency.lockutils [req-dbdf07ba-3edb-4a2f-9008-0d325add0956 req-60f70123-e808-4fe8-ae28-0e05282c29f9 service nova] Lock "04a58b0b-dfd8-4227-9c10-a69225fa5a53-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 627.589256] env[59518]: DEBUG oslo_concurrency.lockutils [req-dbdf07ba-3edb-4a2f-9008-0d325add0956 req-60f70123-e808-4fe8-ae28-0e05282c29f9 service nova] Lock "04a58b0b-dfd8-4227-9c10-a69225fa5a53-events" "released" by 
"nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 627.589256] env[59518]: DEBUG nova.compute.manager [req-dbdf07ba-3edb-4a2f-9008-0d325add0956 req-60f70123-e808-4fe8-ae28-0e05282c29f9 service nova] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] No waiting events found dispatching network-vif-plugged-10aa1ce6-0f4a-4338-96e1-d655b926cd94 {{(pid=59518) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 627.589256] env[59518]: WARNING nova.compute.manager [req-dbdf07ba-3edb-4a2f-9008-0d325add0956 req-60f70123-e808-4fe8-ae28-0e05282c29f9 service nova] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Received unexpected event network-vif-plugged-10aa1ce6-0f4a-4338-96e1-d655b926cd94 for instance with vm_state building and task_state spawning. [ 627.589256] env[59518]: DEBUG nova.compute.manager [req-dbdf07ba-3edb-4a2f-9008-0d325add0956 req-60f70123-e808-4fe8-ae28-0e05282c29f9 service nova] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Received event network-changed-47153a07-1e57-43be-abf7-6351e2ac60fd {{(pid=59518) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 627.589394] env[59518]: DEBUG nova.compute.manager [req-dbdf07ba-3edb-4a2f-9008-0d325add0956 req-60f70123-e808-4fe8-ae28-0e05282c29f9 service nova] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Refreshing instance network info cache due to event network-changed-47153a07-1e57-43be-abf7-6351e2ac60fd. {{(pid=59518) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 627.589394] env[59518]: DEBUG oslo_concurrency.lockutils [req-dbdf07ba-3edb-4a2f-9008-0d325add0956 req-60f70123-e808-4fe8-ae28-0e05282c29f9 service nova] Acquiring lock "refresh_cache-a85abea1-8e8d-4007-803d-e36fff55e587" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 627.589394] env[59518]: DEBUG oslo_concurrency.lockutils [req-dbdf07ba-3edb-4a2f-9008-0d325add0956 req-60f70123-e808-4fe8-ae28-0e05282c29f9 service nova] Acquired lock "refresh_cache-a85abea1-8e8d-4007-803d-e36fff55e587" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 627.589394] env[59518]: DEBUG nova.network.neutron [req-dbdf07ba-3edb-4a2f-9008-0d325add0956 req-60f70123-e808-4fe8-ae28-0e05282c29f9 service nova] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Refreshing network info cache for port 47153a07-1e57-43be-abf7-6351e2ac60fd {{(pid=59518) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 628.071184] env[59518]: DEBUG nova.network.neutron [req-dbdf07ba-3edb-4a2f-9008-0d325add0956 req-60f70123-e808-4fe8-ae28-0e05282c29f9 service nova] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Updated VIF entry in instance network info cache for port 47153a07-1e57-43be-abf7-6351e2ac60fd. 
{{(pid=59518) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 628.071662] env[59518]: DEBUG nova.network.neutron [req-dbdf07ba-3edb-4a2f-9008-0d325add0956 req-60f70123-e808-4fe8-ae28-0e05282c29f9 service nova] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Updating instance_info_cache with network_info: [{"id": "47153a07-1e57-43be-abf7-6351e2ac60fd", "address": "fa:16:3e:33:1d:d0", "network": {"id": "79420454-ca4b-4308-9c1c-7f52cbb6ceda", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1264907137-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b463f53c58ef49fa918299d1ea0f0a87", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ad4fcde7-8926-402a-a9b7-4878d2bc1cf6", "external-id": "nsx-vlan-transportzone-840", "segmentation_id": 840, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap47153a07-1e", "ovs_interfaceid": "47153a07-1e57-43be-abf7-6351e2ac60fd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 628.088312] env[59518]: DEBUG oslo_concurrency.lockutils [req-dbdf07ba-3edb-4a2f-9008-0d325add0956 req-60f70123-e808-4fe8-ae28-0e05282c29f9 service nova] Releasing lock "refresh_cache-a85abea1-8e8d-4007-803d-e36fff55e587" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 628.088312] env[59518]: DEBUG nova.compute.manager [req-dbdf07ba-3edb-4a2f-9008-0d325add0956 req-60f70123-e808-4fe8-ae28-0e05282c29f9 service nova] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Received event network-changed-10aa1ce6-0f4a-4338-96e1-d655b926cd94 {{(pid=59518) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 628.088312] env[59518]: DEBUG nova.compute.manager [req-dbdf07ba-3edb-4a2f-9008-0d325add0956 req-60f70123-e808-4fe8-ae28-0e05282c29f9 service nova] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Refreshing instance network info cache due to event network-changed-10aa1ce6-0f4a-4338-96e1-d655b926cd94. 
{{(pid=59518) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 628.088312] env[59518]: DEBUG oslo_concurrency.lockutils [req-dbdf07ba-3edb-4a2f-9008-0d325add0956 req-60f70123-e808-4fe8-ae28-0e05282c29f9 service nova] Acquiring lock "refresh_cache-04a58b0b-dfd8-4227-9c10-a69225fa5a53" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 628.088312] env[59518]: DEBUG oslo_concurrency.lockutils [req-dbdf07ba-3edb-4a2f-9008-0d325add0956 req-60f70123-e808-4fe8-ae28-0e05282c29f9 service nova] Acquired lock "refresh_cache-04a58b0b-dfd8-4227-9c10-a69225fa5a53" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 628.088708] env[59518]: DEBUG nova.network.neutron [req-dbdf07ba-3edb-4a2f-9008-0d325add0956 req-60f70123-e808-4fe8-ae28-0e05282c29f9 service nova] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Refreshing network info cache for port 10aa1ce6-0f4a-4338-96e1-d655b926cd94 {{(pid=59518) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 628.379186] env[59518]: DEBUG nova.network.neutron [req-dbdf07ba-3edb-4a2f-9008-0d325add0956 req-60f70123-e808-4fe8-ae28-0e05282c29f9 service nova] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Updated VIF entry in instance network info cache for port 10aa1ce6-0f4a-4338-96e1-d655b926cd94. {{(pid=59518) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 628.379553] env[59518]: DEBUG nova.network.neutron [req-dbdf07ba-3edb-4a2f-9008-0d325add0956 req-60f70123-e808-4fe8-ae28-0e05282c29f9 service nova] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Updating instance_info_cache with network_info: [{"id": "10aa1ce6-0f4a-4338-96e1-d655b926cd94", "address": "fa:16:3e:5f:0a:84", "network": {"id": "79420454-ca4b-4308-9c1c-7f52cbb6ceda", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1264907137-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b463f53c58ef49fa918299d1ea0f0a87", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ad4fcde7-8926-402a-a9b7-4878d2bc1cf6", "external-id": "nsx-vlan-transportzone-840", "segmentation_id": 840, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap10aa1ce6-0f", "ovs_interfaceid": "10aa1ce6-0f4a-4338-96e1-d655b926cd94", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 628.391363] env[59518]: DEBUG oslo_concurrency.lockutils [req-dbdf07ba-3edb-4a2f-9008-0d325add0956 req-60f70123-e808-4fe8-ae28-0e05282c29f9 service nova] Releasing lock "refresh_cache-04a58b0b-dfd8-4227-9c10-a69225fa5a53" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 652.076256] env[59518]: WARNING oslo_vmware.rw_handles [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Error occurred while reading the 
HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 652.076256] env[59518]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 652.076256] env[59518]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 652.076256] env[59518]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 652.076256] env[59518]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 652.076256] env[59518]: ERROR oslo_vmware.rw_handles response.begin() [ 652.076256] env[59518]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 652.076256] env[59518]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 652.076256] env[59518]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 652.076256] env[59518]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 652.076256] env[59518]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 652.076256] env[59518]: ERROR oslo_vmware.rw_handles [ 652.076892] env[59518]: DEBUG nova.virt.vmwareapi.images [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] Downloaded image file data e70539a9-144d-4900-807e-914ae0cc8539 to vmware_temp/32908a3e-d6cd-4028-8baf-7ef1b8ddd868/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk on the data store datastore1 {{(pid=59518) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 652.078057] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] Caching image {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 652.078290] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Copying Virtual Disk [datastore1] vmware_temp/32908a3e-d6cd-4028-8baf-7ef1b8ddd868/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk to [datastore1] vmware_temp/32908a3e-d6cd-4028-8baf-7ef1b8ddd868/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk {{(pid=59518) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 652.078567] env[59518]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-c598a16a-1f21-4592-801c-ff3cb845629f {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 652.087041] env[59518]: DEBUG oslo_vmware.api [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Waiting for the task: (returnval){ [ 652.087041] env[59518]: value = "task-307929" [ 652.087041] env[59518]: _type = "Task" [ 652.087041] env[59518]: } to complete. 
{{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 652.096597] env[59518]: DEBUG oslo_vmware.api [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Task: {'id': task-307929, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 652.597446] env[59518]: DEBUG oslo_vmware.exceptions [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Fault InvalidArgument not matched. {{(pid=59518) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 652.597682] env[59518]: DEBUG oslo_concurrency.lockutils [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 652.601039] env[59518]: ERROR nova.compute.manager [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 652.601039] env[59518]: Faults: ['InvalidArgument'] [ 652.601039] env[59518]: ERROR nova.compute.manager [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] Traceback (most recent call last): [ 652.601039] env[59518]: ERROR nova.compute.manager [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 652.601039] env[59518]: ERROR nova.compute.manager [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] yield resources [ 652.601039] env[59518]: ERROR nova.compute.manager [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 652.601039] env[59518]: ERROR nova.compute.manager [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] self.driver.spawn(context, instance, image_meta, [ 652.601039] env[59518]: ERROR nova.compute.manager [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 652.601039] env[59518]: ERROR nova.compute.manager [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 652.601039] env[59518]: ERROR nova.compute.manager [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 652.601039] env[59518]: ERROR nova.compute.manager [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] self._fetch_image_if_missing(context, vi) [ 652.601039] env[59518]: ERROR nova.compute.manager [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 652.601522] env[59518]: ERROR nova.compute.manager [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] image_cache(vi, tmp_image_ds_loc) [ 652.601522] env[59518]: ERROR nova.compute.manager [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 652.601522] env[59518]: ERROR nova.compute.manager [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] vm_util.copy_virtual_disk( [ 652.601522] env[59518]: ERROR nova.compute.manager [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 652.601522] env[59518]: ERROR nova.compute.manager [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] session._wait_for_task(vmdk_copy_task) [ 652.601522] env[59518]: ERROR nova.compute.manager [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 652.601522] env[59518]: ERROR nova.compute.manager [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] return self.wait_for_task(task_ref) [ 652.601522] env[59518]: ERROR nova.compute.manager [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 652.601522] env[59518]: ERROR nova.compute.manager [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] return evt.wait() [ 652.601522] env[59518]: ERROR nova.compute.manager [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 652.601522] env[59518]: ERROR nova.compute.manager [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] result = hub.switch() [ 652.601522] env[59518]: ERROR nova.compute.manager [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 652.601522] env[59518]: ERROR nova.compute.manager [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] return self.greenlet.switch() [ 652.601935] env[59518]: ERROR nova.compute.manager [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 652.601935] env[59518]: ERROR nova.compute.manager [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] self.f(*self.args, **self.kw) [ 652.601935] env[59518]: ERROR nova.compute.manager [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 652.601935] env[59518]: ERROR nova.compute.manager [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] raise exceptions.translate_fault(task_info.error) [ 652.601935] env[59518]: ERROR nova.compute.manager [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 652.601935] env[59518]: ERROR nova.compute.manager [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] Faults: ['InvalidArgument'] [ 652.601935] env[59518]: ERROR nova.compute.manager [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] [ 652.601935] env[59518]: INFO nova.compute.manager [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] Terminating instance [ 652.602995] env[59518]: DEBUG oslo_concurrency.lockutils [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk" {{(pid=59518) 
lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 652.603103] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 652.603625] env[59518]: DEBUG oslo_concurrency.lockutils [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Acquiring lock "refresh_cache-b037b116-3b8c-4f10-990c-a855f96fa61c" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 652.603772] env[59518]: DEBUG oslo_concurrency.lockutils [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Acquired lock "refresh_cache-b037b116-3b8c-4f10-990c-a855f96fa61c" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 652.603934] env[59518]: DEBUG nova.network.neutron [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] Building network info cache for instance {{(pid=59518) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 652.604843] env[59518]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b0a0052d-6284-4d29-b5c8-646c21ff897f {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 652.615619] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 652.615619] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=59518) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 652.616634] env[59518]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0d9185cc-9c2b-4e65-b8dc-b40e9347e54e {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 652.623659] env[59518]: DEBUG oslo_vmware.api [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Waiting for the task: (returnval){ [ 652.623659] env[59518]: value = "session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]52805933-595e-63f8-f5bb-c02281c88e53" [ 652.623659] env[59518]: _type = "Task" [ 652.623659] env[59518]: } to complete. 
{{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 652.634250] env[59518]: DEBUG oslo_vmware.api [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Task: {'id': session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]52805933-595e-63f8-f5bb-c02281c88e53, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 652.708573] env[59518]: DEBUG nova.network.neutron [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] Instance cache missing network info. {{(pid=59518) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 653.136231] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] Preparing fetch location {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 653.136619] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Creating directory with path [datastore1] vmware_temp/72e0e323-6c59-4db2-be98-c64fccd911db/e70539a9-144d-4900-807e-914ae0cc8539 {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 653.136975] env[59518]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-0cfec594-ccc7-4cc0-8309-f50efb6433d7 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 653.158174] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Created directory with path [datastore1] vmware_temp/72e0e323-6c59-4db2-be98-c64fccd911db/e70539a9-144d-4900-807e-914ae0cc8539 {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 653.158388] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] Fetch image to [datastore1] vmware_temp/72e0e323-6c59-4db2-be98-c64fccd911db/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 653.158554] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] Downloading image file data e70539a9-144d-4900-807e-914ae0cc8539 to [datastore1] vmware_temp/72e0e323-6c59-4db2-be98-c64fccd911db/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk on the data store datastore1 {{(pid=59518) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 653.159500] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-4eac3da0-5926-4dd0-a1b4-ab8bb6dc5f5a {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 653.169142] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-83335430-e99c-4e62-a256-ee1d51d13e5d {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 653.178665] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42c9a1cb-2cde-4bee-902a-3fe2da8fa0bf {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 653.213124] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cd664297-ac22-4258-be68-6fbc84fb50f7 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 653.219485] env[59518]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-c8c89bfa-c4df-4f03-b7d9-1ce772b82044 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 653.305824] env[59518]: DEBUG nova.virt.vmwareapi.images [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] Downloading image file data e70539a9-144d-4900-807e-914ae0cc8539 to the data store datastore1 {{(pid=59518) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 653.358705] env[59518]: DEBUG nova.network.neutron [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] Updating instance_info_cache with network_info: [] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 653.368662] env[59518]: DEBUG oslo_vmware.rw_handles [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/72e0e323-6c59-4db2-be98-c64fccd911db/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59518) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 653.369374] env[59518]: DEBUG oslo_concurrency.lockutils [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Releasing lock "refresh_cache-b037b116-3b8c-4f10-990c-a855f96fa61c" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 653.369779] env[59518]: DEBUG nova.compute.manager [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] Start destroying the instance on the hypervisor. 
{{(pid=59518) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 653.369979] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] Destroying instance {{(pid=59518) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 653.433670] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-34253ab6-d407-4440-aec9-cd404be1f949 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 653.439281] env[59518]: DEBUG oslo_vmware.rw_handles [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Completed reading data from the image iterator. {{(pid=59518) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 653.439463] env[59518]: DEBUG oslo_vmware.rw_handles [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/72e0e323-6c59-4db2-be98-c64fccd911db/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59518) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 653.446011] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] Unregistering the VM {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 653.446011] env[59518]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-5c090a13-8892-46bb-b2e3-0aa24be2cc57 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 653.472813] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] Unregistered the VM {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 653.475900] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] Deleting contents of the VM from datastore datastore1 {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 653.475900] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Deleting the datastore file [datastore1] b037b116-3b8c-4f10-990c-a855f96fa61c {{(pid=59518) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 653.475900] env[59518]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-dc1308a8-bfb5-40a4-8b1a-bd841c1688ce {{(pid=59518) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 653.482785] env[59518]: DEBUG oslo_vmware.api [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Waiting for the task: (returnval){ [ 653.482785] env[59518]: value = "task-307931" [ 653.482785] env[59518]: _type = "Task" [ 653.482785] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 653.491397] env[59518]: DEBUG oslo_vmware.api [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Task: {'id': task-307931, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 653.993313] env[59518]: DEBUG oslo_vmware.api [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Task: {'id': task-307931, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.036716} completed successfully. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 653.993554] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Deleted the datastore file {{(pid=59518) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 653.993731] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] Deleted contents of the VM from datastore datastore1 {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 653.993898] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] Instance destroyed {{(pid=59518) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 653.994305] env[59518]: INFO nova.compute.manager [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] Took 0.62 seconds to destroy the instance on the hypervisor. [ 653.994546] env[59518]: DEBUG oslo.service.loopingcall [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=59518) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 653.994744] env[59518]: DEBUG nova.compute.manager [-] [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] Skipping network deallocation for instance since networking was not requested.
{{(pid=59518) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 653.997052] env[59518]: DEBUG nova.compute.claims [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] Aborting claim: {{(pid=59518) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 653.997278] env[59518]: DEBUG oslo_concurrency.lockutils [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 653.997485] env[59518]: DEBUG oslo_concurrency.lockutils [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 654.283198] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b2c4bbc7-e2f1-4180-a6e8-91a4441c3f64 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 654.293946] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b0ade3f-b39f-4bb0-a965-b7c4e3827862 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 654.327781] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09cac442-761b-4e4f-83a3-e7a041a3901f {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 654.335666] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7b45e15b-6928-4f3d-8ef3-8403e897c187 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 654.354977] env[59518]: DEBUG nova.compute.provider_tree [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Inventory has not changed in ProviderTree for provider: ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd {{(pid=59518) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 654.366206] env[59518]: DEBUG nova.scheduler.client.report [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Inventory has not changed for provider ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 174, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59518) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 654.405286] env[59518]: DEBUG oslo_concurrency.lockutils [None req-6000652d-26be-4c7e-9252-efcee19cd456 
tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.406s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 654.405286] env[59518]: ERROR nova.compute.manager [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 654.405286] env[59518]: Faults: ['InvalidArgument'] [ 654.405286] env[59518]: ERROR nova.compute.manager [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] Traceback (most recent call last): [ 654.405286] env[59518]: ERROR nova.compute.manager [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 654.405286] env[59518]: ERROR nova.compute.manager [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] self.driver.spawn(context, instance, image_meta, [ 654.405286] env[59518]: ERROR nova.compute.manager [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 654.405286] env[59518]: ERROR nova.compute.manager [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 654.405286] env[59518]: ERROR nova.compute.manager [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 654.405286] env[59518]: ERROR nova.compute.manager [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] self._fetch_image_if_missing(context, vi) [ 654.406064] env[59518]: ERROR nova.compute.manager [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 654.406064] env[59518]: ERROR nova.compute.manager [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] image_cache(vi, tmp_image_ds_loc) [ 654.406064] env[59518]: ERROR nova.compute.manager [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 654.406064] env[59518]: ERROR nova.compute.manager [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] vm_util.copy_virtual_disk( [ 654.406064] env[59518]: ERROR nova.compute.manager [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 654.406064] env[59518]: ERROR nova.compute.manager [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] session._wait_for_task(vmdk_copy_task) [ 654.406064] env[59518]: ERROR nova.compute.manager [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 654.406064] env[59518]: ERROR nova.compute.manager [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] return self.wait_for_task(task_ref) [ 654.406064] env[59518]: ERROR nova.compute.manager [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 654.406064] env[59518]: ERROR nova.compute.manager [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] return evt.wait() [ 654.406064] env[59518]: ERROR 
nova.compute.manager [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 654.406064] env[59518]: ERROR nova.compute.manager [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] result = hub.switch() [ 654.406064] env[59518]: ERROR nova.compute.manager [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 654.406535] env[59518]: ERROR nova.compute.manager [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] return self.greenlet.switch() [ 654.406535] env[59518]: ERROR nova.compute.manager [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 654.406535] env[59518]: ERROR nova.compute.manager [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] self.f(*self.args, **self.kw) [ 654.406535] env[59518]: ERROR nova.compute.manager [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 654.406535] env[59518]: ERROR nova.compute.manager [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] raise exceptions.translate_fault(task_info.error) [ 654.406535] env[59518]: ERROR nova.compute.manager [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 654.406535] env[59518]: ERROR nova.compute.manager [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] Faults: ['InvalidArgument'] [ 654.406535] env[59518]: ERROR nova.compute.manager [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] [ 654.406535] env[59518]: DEBUG nova.compute.utils [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] VimFaultException {{(pid=59518) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 654.413620] env[59518]: DEBUG nova.compute.manager [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] Build of instance b037b116-3b8c-4f10-990c-a855f96fa61c was re-scheduled: A specified parameter was not correct: fileType [ 654.413620] env[59518]: Faults: ['InvalidArgument'] {{(pid=59518) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 654.413620] env[59518]: DEBUG nova.compute.manager [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] Unplugging VIFs for instance {{(pid=59518) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 654.413620] env[59518]: DEBUG oslo_concurrency.lockutils [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Acquiring lock "refresh_cache-b037b116-3b8c-4f10-990c-a855f96fa61c" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 654.413620] env[59518]: DEBUG oslo_concurrency.lockutils [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Acquired lock 
"refresh_cache-b037b116-3b8c-4f10-990c-a855f96fa61c" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 654.414050] env[59518]: DEBUG nova.network.neutron [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] Building network info cache for instance {{(pid=59518) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 654.506290] env[59518]: DEBUG nova.network.neutron [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] Instance cache missing network info. {{(pid=59518) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 654.942620] env[59518]: DEBUG nova.network.neutron [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] Updating instance_info_cache with network_info: [] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 654.951113] env[59518]: DEBUG oslo_concurrency.lockutils [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Releasing lock "refresh_cache-b037b116-3b8c-4f10-990c-a855f96fa61c" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 654.951365] env[59518]: DEBUG nova.compute.manager [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59518) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 654.951547] env[59518]: DEBUG nova.compute.manager [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] Skipping network deallocation for instance since networking was not requested. 
{{(pid=59518) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 655.057399] env[59518]: INFO nova.scheduler.client.report [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Deleted allocations for instance b037b116-3b8c-4f10-990c-a855f96fa61c [ 655.087711] env[59518]: DEBUG oslo_concurrency.lockutils [None req-6000652d-26be-4c7e-9252-efcee19cd456 tempest-ServerDiagnosticsV248Test-109633817 tempest-ServerDiagnosticsV248Test-109633817-project-member] Lock "b037b116-3b8c-4f10-990c-a855f96fa61c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 51.721s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 655.089220] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "b037b116-3b8c-4f10-990c-a855f96fa61c" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 44.490s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 655.089411] env[59518]: INFO nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: b037b116-3b8c-4f10-990c-a855f96fa61c] During sync_power_state the instance has a pending task (spawning). Skip. [ 655.089591] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "b037b116-3b8c-4f10-990c-a855f96fa61c" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 655.120620] env[59518]: DEBUG nova.compute.manager [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] [instance: 282b61db-76cd-44c3-b500-7a465e903c97] Starting instance...
{{(pid=59518) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 655.194598] env[59518]: DEBUG oslo_concurrency.lockutils [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 655.194598] env[59518]: DEBUG oslo_concurrency.lockutils [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 655.194598] env[59518]: INFO nova.compute.claims [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] [instance: 282b61db-76cd-44c3-b500-7a465e903c97] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 655.452138] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-075e5a7b-21c7-469a-8545-53aa4a61f8be {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 655.459685] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c6d12926-ba8e-46de-a51b-a3ff1b4a554c {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 655.498062] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b764747-d00a-4291-9612-f8b883a0fbdf {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 655.508182] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ce19d82-18d9-4c66-9b0c-aa3223e37ae8 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 655.528128] env[59518]: DEBUG nova.compute.provider_tree [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Inventory has not changed in ProviderTree for provider: ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd {{(pid=59518) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 655.539810] env[59518]: DEBUG nova.scheduler.client.report [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Inventory has not changed for provider ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 174, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59518) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 655.557711] env[59518]: DEBUG oslo_concurrency.lockutils [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 
tempest-ServersAdmin275Test-1556032142-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.367s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 655.558222] env[59518]: DEBUG nova.compute.manager [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] [instance: 282b61db-76cd-44c3-b500-7a465e903c97] Start building networks asynchronously for instance. {{(pid=59518) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 655.599017] env[59518]: DEBUG nova.compute.utils [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Using /dev/sd instead of None {{(pid=59518) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 655.599017] env[59518]: DEBUG nova.compute.manager [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] [instance: 282b61db-76cd-44c3-b500-7a465e903c97] Not allocating networking since 'none' was specified. {{(pid=59518) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1948}} [ 655.628599] env[59518]: DEBUG nova.compute.manager [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] [instance: 282b61db-76cd-44c3-b500-7a465e903c97] Start building block device mappings for instance. {{(pid=59518) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 655.717812] env[59518]: DEBUG nova.compute.manager [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] [instance: 282b61db-76cd-44c3-b500-7a465e903c97] Start spawning the instance on the hypervisor. 
{{(pid=59518) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 655.738849] env[59518]: DEBUG nova.virt.hardware [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-05-28T13:15:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-05-28T13:15:21Z,direct_url=<?>,disk_format='vmdk',id=e70539a9-144d-4900-807e-914ae0cc8539,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='33bab75ff2cf45ecb4ab54af3adf83ee',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2024-05-28T13:15:22Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 655.738849] env[59518]: DEBUG nova.virt.hardware [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Flavor limits 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 655.739065] env[59518]: DEBUG nova.virt.hardware [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Image limits 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 655.739283] env[59518]: DEBUG nova.virt.hardware [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Flavor pref 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 655.739429] env[59518]: DEBUG nova.virt.hardware [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Image pref 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 655.739600] env[59518]: DEBUG nova.virt.hardware [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 655.739854] env[59518]: DEBUG nova.virt.hardware [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 655.740436] env[59518]: DEBUG nova.virt.hardware [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59518) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 655.740698] env[59518]: DEBUG nova.virt.hardware [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961
tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Got 1 possible topologies {{(pid=59518) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 655.740922] env[59518]: DEBUG nova.virt.hardware [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 655.741120] env[59518]: DEBUG nova.virt.hardware [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 655.742020] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-661c57f8-df81-4321-8876-f49459d90a49 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 655.750363] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3d6f6a9e-3dbf-4d2e-b412-81abdb5fbd12 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 655.767130] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] [instance: 282b61db-76cd-44c3-b500-7a465e903c97] Instance VIF info [] {{(pid=59518) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 655.773804] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Creating folder: Project (2ae568067b4940ffb1fe145e1e65e9ca). Parent ref: group-v88807. {{(pid=59518) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 655.774279] env[59518]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-46118787-d8fa-4424-8f50-b42a40c33537 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 655.784303] env[59518]: INFO nova.virt.vmwareapi.vm_util [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Created folder: Project (2ae568067b4940ffb1fe145e1e65e9ca) in parent group-v88807. [ 655.785114] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Creating folder: Instances. Parent ref: group-v88834. {{(pid=59518) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 655.785425] env[59518]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-787157b2-9439-43e2-bbca-75fafac12fbd {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 655.793914] env[59518]: INFO nova.virt.vmwareapi.vm_util [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Created folder: Instances in parent group-v88834. 
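The "Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return" entry below, and the task-progress polling that follows it, come from oslo.service's looping-call helper (the oslo_vmware.api lines at api.py:434 use the same mechanism). A self-contained sketch of that polling pattern, with a simulated task standing in for CreateVM_Task:

    from oslo_service import loopingcall

    progress = {'pct': 0}

    def _poll_task():
        # Stand-in for oslo_vmware.api._poll_task querying TaskInfo.
        progress['pct'] += 50
        print("Task progress is %d%%." % progress['pct'])
        if progress['pct'] >= 100:
            raise loopingcall.LoopingCallDone(retvalue='task-307934')

    timer = loopingcall.FixedIntervalLoopingCall(_poll_task)
    result = timer.start(interval=0.5).wait()   # blocks until LoopingCallDone
    print('completed successfully:', result)

Raising LoopingCallDone stops the loop and hands its retvalue to whoever is blocked on wait(), which is how a task waiter returns the finished task to its caller.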
[ 655.794355] env[59518]: DEBUG oslo.service.loopingcall [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59518) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 655.794994] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 282b61db-76cd-44c3-b500-7a465e903c97] Creating VM on the ESX host {{(pid=59518) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 655.795623] env[59518]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-843ee8b2-f242-443b-9d21-4cb0c397cf89 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 655.818381] env[59518]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 655.818381] env[59518]: value = "task-307934" [ 655.818381] env[59518]: _type = "Task" [ 655.818381] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 655.826165] env[59518]: DEBUG oslo_vmware.api [-] Task: {'id': task-307934, 'name': CreateVM_Task} progress is 0%. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 656.329241] env[59518]: DEBUG oslo_vmware.api [-] Task: {'id': task-307934, 'name': CreateVM_Task, 'duration_secs': 0.455381} completed successfully. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 656.329241] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 282b61db-76cd-44c3-b500-7a465e903c97] Created VM on the ESX host {{(pid=59518) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 656.329764] env[59518]: DEBUG oslo_concurrency.lockutils [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 656.329764] env[59518]: DEBUG oslo_concurrency.lockutils [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 656.329970] env[59518]: DEBUG oslo_concurrency.lockutils [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 656.330172] env[59518]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-16723af7-04ac-4e30-a469-6de569ebea9a {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 656.334924] env[59518]: DEBUG oslo_vmware.api [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Waiting for the task: (returnval){ [ 656.334924] env[59518]: value = 
"session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]52507909-4c1c-c82a-1ccf-d9df8f103b98" [ 656.334924] env[59518]: _type = "Task" [ 656.334924] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 656.346794] env[59518]: DEBUG oslo_vmware.api [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Task: {'id': session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]52507909-4c1c-c82a-1ccf-d9df8f103b98, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 656.847367] env[59518]: DEBUG oslo_concurrency.lockutils [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 656.847367] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] [instance: 282b61db-76cd-44c3-b500-7a465e903c97] Processing image e70539a9-144d-4900-807e-914ae0cc8539 {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 656.847367] env[59518]: DEBUG oslo_concurrency.lockutils [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 674.829539] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 674.860546] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 675.449652] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 675.449652] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 675.449652] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Starting heal instance info cache {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9808}} [ 675.449652] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Rebuilding the list of instances to heal {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}} [ 675.471950] env[59518]: DEBUG 
nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 675.471950] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 675.471950] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 675.471950] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 675.471950] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 675.472156] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: b4fb287e-7329-4002-911a-2d1eee138372] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 675.472156] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 675.472156] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: eead3c41-1a63-48f7-941e-24470658ed13] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 675.472156] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 675.472156] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: 282b61db-76cd-44c3-b500-7a465e903c97] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 675.472287] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Didn't find any instances for network info cache update. 
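Every instance in this batch is skipped because it is still Building, so the heal task ends up reporting nothing to do. An illustrative re-creation of that filter (not Nova's actual code, which works on instance objects rather than dicts):

    BUILDING = 'building'

    instances = [
        {'uuid': 'bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70', 'vm_state': BUILDING},
        {'uuid': '282b61db-76cd-44c3-b500-7a465e903c97', 'vm_state': BUILDING},
    ]

    to_heal = []
    for inst in instances:
        if inst['vm_state'] == BUILDING:
            print('Skipping network cache update for instance %s because '
                  'it is Building.' % inst['uuid'])
            continue
        to_heal.append(inst)

    if not to_heal:
        print("Didn't find any instances for network info cache update.")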
{{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9894}} [ 675.472287] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 675.472287] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 675.472287] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 675.472287] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=59518) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10427}} [ 676.447277] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 676.447554] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 676.447657] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 676.458983] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 676.458983] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 676.458983] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 676.459169] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59518) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 676.460140] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-30fca568-510c-4046-a359-272c222365f7 
{{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 676.470003] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d3969fa-8f66-4e12-bcd1-d2b508b02799 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 676.485286] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7a02d68b-4489-447d-a7f9-8ea65ba23eb3 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 676.492821] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b35bcf6e-afd4-4373-a737-3c28640cba8a {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 676.528954] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181704MB free_disk=174GB free_vcpus=48 pci_devices=None {{(pid=59518) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 676.529258] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 676.529537] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 676.601462] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 676.601614] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 21f44d88-a868-4765-95f8-8dbe8eccef7a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 676.601737] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 6ffd468f-b92f-45ae-834f-6daac20937ef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 676.601852] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 37d863b9-bfcb-4d1f-b99b-832276bd640f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
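The Acquiring/acquired/released triplets around "compute_resources" (lockutils.py:404/409/423) are emitted by oslo.concurrency's synchronized decorator. A minimal runnable sketch; the empty body is a placeholder for the resource tracker's bookkeeping:

    import logging
    from oslo_concurrency import lockutils

    logging.basicConfig(level=logging.DEBUG)  # surfaces lockutils' debug lines

    @lockutils.synchronized('compute_resources')
    def _update_available_resource():
        pass  # placeholder for the resource tracker's bookkeeping

    _update_available_resource()

The "waited" and "held" durations in the log are measured by that wrapper, so a longer "held" time (0.434s below) reflects work done under the lock rather than lock contention.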
{{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 676.601963] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance a894a8af-52b8-4b1c-a5ea-2469f06ea17a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 676.602073] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance b4fb287e-7329-4002-911a-2d1eee138372 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 676.602182] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance a85abea1-8e8d-4007-803d-e36fff55e587 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 676.602287] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance eead3c41-1a63-48f7-941e-24470658ed13 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 676.602394] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 04a58b0b-dfd8-4227-9c10-a69225fa5a53 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 676.602498] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 282b61db-76cd-44c3-b500-7a465e903c97 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 676.628085] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 3e59b5d7-978d-405a-b68a-47ee03b9a713 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 676.658244] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance af4b8dd9-a05d-427e-a147-76c7cfec5862 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
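The per-instance allocations above also account for the totals in the "Final resource view" entry further down (used_ram=1792MB, used_vcpus=10, used_disk=10GB). A worked check, assuming used RAM is the 512 MB reserved in the inventory entries plus the sum of instance allocations:

    # Ten actively managed instances, each holding the allocation shown above.
    allocations = [{'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}] * 10

    reserved_ram_mb = 512  # 'reserved' for MEMORY_MB in the inventory entries
    used_ram_mb = reserved_ram_mb + sum(a['MEMORY_MB'] for a in allocations)
    used_vcpus = sum(a['VCPU'] for a in allocations)
    used_disk_gb = sum(a['DISK_GB'] for a in allocations)
    print(used_ram_mb, used_vcpus, used_disk_gb)   # 1792 10 10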
{{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 676.669308] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance ae88d565-bbf5-4c29-aee9-364c23086de5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 676.669544] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=59518) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 676.669685] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=59518) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 676.875180] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-41d1cb57-0459-42c7-8c66-2f5d9195a5b0 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 676.882911] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7fdc18a8-024f-4b3c-bf8b-0c1fe695649e {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 676.915044] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9cddb49e-f05b-44b4-b748-f762d0e55f61 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 676.922581] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0f9e4df3-3bc0-412b-b57e-36f890a9dc0c {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 676.936757] env[59518]: DEBUG nova.compute.provider_tree [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Inventory has not changed in ProviderTree for provider: ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd {{(pid=59518) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 676.946564] env[59518]: DEBUG nova.scheduler.client.report [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Inventory has not changed for provider ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 174, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59518) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 676.963381] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59518) _update_available_resource 
/opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 676.963563] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.434s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 684.532384] env[59518]: DEBUG oslo_concurrency.lockutils [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] Acquiring lock "ac3485ac-4817-4492-a196-331002b2cc46" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 684.532697] env[59518]: DEBUG oslo_concurrency.lockutils [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] Lock "ac3485ac-4817-4492-a196-331002b2cc46" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 694.128409] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c0b5d5aa-bf34-416c-98c9-a0b836996290 tempest-AttachVolumeShelveTestJSON-165117633 tempest-AttachVolumeShelveTestJSON-165117633-project-member] Acquiring lock "ad83492d-05a6-428d-b343-740c977105f8" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 694.128409] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c0b5d5aa-bf34-416c-98c9-a0b836996290 tempest-AttachVolumeShelveTestJSON-165117633 tempest-AttachVolumeShelveTestJSON-165117633-project-member] Lock "ad83492d-05a6-428d-b343-740c977105f8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 696.994087] env[59518]: DEBUG oslo_concurrency.lockutils [None req-24c28a7d-c7e9-499f-81f5-2d33b6d14bf4 tempest-ServerRescueTestJSON-1806638401 tempest-ServerRescueTestJSON-1806638401-project-member] Acquiring lock "c4eb8b6b-c958-4d21-9e6c-c1ba6b62c619" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 696.994353] env[59518]: DEBUG oslo_concurrency.lockutils [None req-24c28a7d-c7e9-499f-81f5-2d33b6d14bf4 tempest-ServerRescueTestJSON-1806638401 tempest-ServerRescueTestJSON-1806638401-project-member] Lock "c4eb8b6b-c958-4d21-9e6c-c1ba6b62c619" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 699.067223] env[59518]: DEBUG oslo_concurrency.lockutils [None req-edbf377f-1109-47fb-8663-576991df6c29 tempest-ServersNegativeTestMultiTenantJSON-1906642749 tempest-ServersNegativeTestMultiTenantJSON-1906642749-project-member] Acquiring lock "2aaaecaa-86c6-4d8c-89a8-7d8e2405d294" by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 699.067561] env[59518]: DEBUG oslo_concurrency.lockutils [None req-edbf377f-1109-47fb-8663-576991df6c29 tempest-ServersNegativeTestMultiTenantJSON-1906642749 tempest-ServersNegativeTestMultiTenantJSON-1906642749-project-member] Lock "2aaaecaa-86c6-4d8c-89a8-7d8e2405d294" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 699.457531] env[59518]: DEBUG oslo_concurrency.lockutils [None req-fe5a9076-47ad-49dd-9a3c-9ca5e96c3b51 tempest-ServerAddressesTestJSON-29904488 tempest-ServerAddressesTestJSON-29904488-project-member] Acquiring lock "3bdabf32-3735-4670-8591-fad410629d95" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 699.457782] env[59518]: DEBUG oslo_concurrency.lockutils [None req-fe5a9076-47ad-49dd-9a3c-9ca5e96c3b51 tempest-ServerAddressesTestJSON-29904488 tempest-ServerAddressesTestJSON-29904488-project-member] Lock "3bdabf32-3735-4670-8591-fad410629d95" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 699.994532] env[59518]: WARNING oslo_vmware.rw_handles [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 699.994532] env[59518]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 699.994532] env[59518]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 699.994532] env[59518]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 699.994532] env[59518]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 699.994532] env[59518]: ERROR oslo_vmware.rw_handles response.begin() [ 699.994532] env[59518]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 699.994532] env[59518]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 699.994532] env[59518]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 699.994532] env[59518]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 699.994532] env[59518]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 699.994532] env[59518]: ERROR oslo_vmware.rw_handles [ 699.994938] env[59518]: DEBUG nova.virt.vmwareapi.images [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] Downloaded image file data e70539a9-144d-4900-807e-914ae0cc8539 to vmware_temp/72e0e323-6c59-4db2-be98-c64fccd911db/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk on the data 
store datastore1 {{(pid=59518) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 699.996224] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] Caching image {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 699.996452] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Copying Virtual Disk [datastore1] vmware_temp/72e0e323-6c59-4db2-be98-c64fccd911db/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk to [datastore1] vmware_temp/72e0e323-6c59-4db2-be98-c64fccd911db/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk {{(pid=59518) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 699.996720] env[59518]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-7dc09e2e-9bba-49df-becb-289cec1ebb2b {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 700.005504] env[59518]: DEBUG oslo_vmware.api [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Waiting for the task: (returnval){ [ 700.005504] env[59518]: value = "task-307949" [ 700.005504] env[59518]: _type = "Task" [ 700.005504] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 700.015307] env[59518]: DEBUG oslo_vmware.api [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Task: {'id': task-307949, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 700.516198] env[59518]: DEBUG oslo_vmware.exceptions [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Fault InvalidArgument not matched. 
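"Fault InvalidArgument not matched" means oslo.vmware found no registered exception class for that fault name, so the failed CopyVirtualDisk_Task surfaces as a generic VimFaultException, as the traceback below shows. A small sketch of both behaviours; the exception is constructed by hand here, whereas real code derives it from the failed task's error:

    from oslo_vmware import exceptions as vexc

    # Unregistered fault names return None (and log "Fault ... not matched"),
    # so callers fall back to the generic exception type.
    print(vexc.get_fault_class('InvalidArgument'))   # None

    fault = vexc.VimFaultException(
        ['InvalidArgument'], 'A specified parameter was not correct: fileType')
    try:
        raise fault
    except vexc.VimFaultException as exc:
        print(exc.fault_list)   # ['InvalidArgument']
        print(exc)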
{{(pid=59518) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 700.516460] env[59518]: DEBUG oslo_concurrency.lockutils [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 700.516990] env[59518]: ERROR nova.compute.manager [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 700.516990] env[59518]: Faults: ['InvalidArgument'] [ 700.516990] env[59518]: ERROR nova.compute.manager [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] Traceback (most recent call last): [ 700.516990] env[59518]: ERROR nova.compute.manager [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 700.516990] env[59518]: ERROR nova.compute.manager [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] yield resources [ 700.516990] env[59518]: ERROR nova.compute.manager [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 700.516990] env[59518]: ERROR nova.compute.manager [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] self.driver.spawn(context, instance, image_meta, [ 700.516990] env[59518]: ERROR nova.compute.manager [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 700.516990] env[59518]: ERROR nova.compute.manager [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 700.516990] env[59518]: ERROR nova.compute.manager [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 700.516990] env[59518]: ERROR nova.compute.manager [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] self._fetch_image_if_missing(context, vi) [ 700.516990] env[59518]: ERROR nova.compute.manager [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 700.517394] env[59518]: ERROR nova.compute.manager [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] image_cache(vi, tmp_image_ds_loc) [ 700.517394] env[59518]: ERROR nova.compute.manager [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 700.517394] env[59518]: ERROR nova.compute.manager [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] vm_util.copy_virtual_disk( [ 700.517394] env[59518]: ERROR nova.compute.manager [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 700.517394] env[59518]: ERROR nova.compute.manager [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] session._wait_for_task(vmdk_copy_task) [ 700.517394] env[59518]: ERROR nova.compute.manager [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in 
_wait_for_task [ 700.517394] env[59518]: ERROR nova.compute.manager [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] return self.wait_for_task(task_ref) [ 700.517394] env[59518]: ERROR nova.compute.manager [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 700.517394] env[59518]: ERROR nova.compute.manager [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] return evt.wait() [ 700.517394] env[59518]: ERROR nova.compute.manager [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 700.517394] env[59518]: ERROR nova.compute.manager [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] result = hub.switch() [ 700.517394] env[59518]: ERROR nova.compute.manager [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 700.517394] env[59518]: ERROR nova.compute.manager [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] return self.greenlet.switch() [ 700.517787] env[59518]: ERROR nova.compute.manager [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 700.517787] env[59518]: ERROR nova.compute.manager [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] self.f(*self.args, **self.kw) [ 700.517787] env[59518]: ERROR nova.compute.manager [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 700.517787] env[59518]: ERROR nova.compute.manager [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] raise exceptions.translate_fault(task_info.error) [ 700.517787] env[59518]: ERROR nova.compute.manager [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 700.517787] env[59518]: ERROR nova.compute.manager [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] Faults: ['InvalidArgument'] [ 700.517787] env[59518]: ERROR nova.compute.manager [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] [ 700.517787] env[59518]: INFO nova.compute.manager [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] Terminating instance [ 700.518889] env[59518]: DEBUG oslo_concurrency.lockutils [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 700.519140] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 700.519420] env[59518]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-50f73f0c-4011-4c01-ba07-3282fd2bab7b {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 700.524535] env[59518]: DEBUG nova.compute.manager [None 
req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] Start destroying the instance on the hypervisor. {{(pid=59518) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 700.524732] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] Destroying instance {{(pid=59518) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 700.525522] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dca349e8-df61-43d7-ab0c-c39b7d42804c {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 700.536120] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] Unregistering the VM {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 700.536411] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 700.536593] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=59518) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 700.537237] env[59518]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-820aa8f1-c588-4c73-b04b-36b51c88110e {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 700.538987] env[59518]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-91971b76-b3ac-4796-a415-685e1fa6eae0 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 700.545257] env[59518]: DEBUG oslo_vmware.api [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Waiting for the task: (returnval){ [ 700.545257] env[59518]: value = "session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]52da3843-a6be-2a72-8f42-63596a917c58" [ 700.545257] env[59518]: _type = "Task" [ 700.545257] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 700.554370] env[59518]: DEBUG oslo_vmware.api [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Task: {'id': session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]52da3843-a6be-2a72-8f42-63596a917c58, 'name': SearchDatastore_Task} progress is 0%. 
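The bracketed "[datastore1] ..." strings throughout these entries are datastore paths; oslo.vmware ships a helper for composing and parsing them, sketched here with values taken from the log:

    from oslo_vmware.objects.datastore import DatastorePath

    p = DatastorePath('datastore1', 'devstack-image-cache_base',
                      'e70539a9-144d-4900-807e-914ae0cc8539.vmdk')
    print(str(p))   # [datastore1] devstack-image-cache_base/e70539a9-...vmdk

    q = DatastorePath.parse('[datastore1] 21f44d88-a868-4765-95f8-8dbe8eccef7a')
    print(q.datastore, q.rel_path)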
{{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 700.616373] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] Unregistered the VM {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 700.616600] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] Deleting contents of the VM from datastore datastore1 {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 700.616769] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Deleting the datastore file [datastore1] 21f44d88-a868-4765-95f8-8dbe8eccef7a {{(pid=59518) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 700.617032] env[59518]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-6e97d835-c0c5-4a64-826b-b072c8b698a9 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 700.625452] env[59518]: DEBUG oslo_vmware.api [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Waiting for the task: (returnval){ [ 700.625452] env[59518]: value = "task-307952" [ 700.625452] env[59518]: _type = "Task" [ 700.625452] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 700.638834] env[59518]: DEBUG oslo_vmware.api [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Task: {'id': task-307952, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 701.057249] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70] Preparing fetch location {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 701.057533] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Creating directory with path [datastore1] vmware_temp/d84030d6-44c4-42a5-8590-f826480a6e80/e70539a9-144d-4900-807e-914ae0cc8539 {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 701.064823] env[59518]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f35969c5-a490-439a-a676-59a242abb01b {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 701.078662] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Created directory with path [datastore1] vmware_temp/d84030d6-44c4-42a5-8590-f826480a6e80/e70539a9-144d-4900-807e-914ae0cc8539 {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 701.078881] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70] Fetch image to [datastore1] vmware_temp/d84030d6-44c4-42a5-8590-f826480a6e80/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 701.079058] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70] Downloading image file data e70539a9-144d-4900-807e-914ae0cc8539 to [datastore1] vmware_temp/d84030d6-44c4-42a5-8590-f826480a6e80/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk on the data store datastore1 {{(pid=59518) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 701.079853] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eaa26c57-c0d8-4d62-8a89-7dde6ac2572c {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 701.087688] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f4cbd7b-405a-4ffd-bde6-101d21c2d0b3 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 701.098436] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec3e2ba0-d4c9-41fa-bdb5-b8fb8fa216b6 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 701.134097] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-85ec81d4-a3a8-4902-bcc1-f90ea8b02b3a {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 701.143535] env[59518]: DEBUG oslo_vmware.api 
[None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Task: {'id': task-307952, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.095702} completed successfully. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 701.144098] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Deleted the datastore file {{(pid=59518) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 701.144279] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] Deleted contents of the VM from datastore datastore1 {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 701.144440] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] Instance destroyed {{(pid=59518) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 701.144597] env[59518]: INFO nova.compute.manager [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] Took 0.62 seconds to destroy the instance on the hypervisor. [ 701.146181] env[59518]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-0e867512-39af-45be-9b7a-182062604993 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 701.148293] env[59518]: DEBUG nova.compute.claims [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] Aborting claim: {{(pid=59518) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 701.148458] env[59518]: DEBUG oslo_concurrency.lockutils [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 701.148656] env[59518]: DEBUG oslo_concurrency.lockutils [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 701.172202] env[59518]: DEBUG nova.virt.vmwareapi.images [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70] Downloading image file data e70539a9-144d-4900-807e-914ae0cc8539 to the data store datastore1 {{(pid=59518) 
fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 701.236483] env[59518]: DEBUG oslo_vmware.rw_handles [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/d84030d6-44c4-42a5-8590-f826480a6e80/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59518) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 701.291362] env[59518]: DEBUG oslo_vmware.rw_handles [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Completed reading data from the image iterator. {{(pid=59518) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 701.294501] env[59518]: DEBUG oslo_vmware.rw_handles [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/d84030d6-44c4-42a5-8590-f826480a6e80/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59518) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 701.549443] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a59fd214-7c01-4a8e-bf2b-9e9b8ba751a9 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 701.557796] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc014eda-cc15-4038-b27c-071b66a7bcc1 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 701.587676] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-99a8debb-7b6a-465c-8199-4a9fa94054f6 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 701.595975] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1067f0bc-b674-416d-b7d1-f752b80802f5 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 701.609855] env[59518]: DEBUG nova.compute.provider_tree [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Inventory has not changed in ProviderTree for provider: ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd {{(pid=59518) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 701.619154] env[59518]: DEBUG nova.scheduler.client.report [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Inventory has not changed for provider ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 174, 'step_size': 1, 'allocation_ratio': 
1.0}} {{(pid=59518) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 701.638037] env[59518]: DEBUG oslo_concurrency.lockutils [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.489s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 701.638563] env[59518]: ERROR nova.compute.manager [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 701.638563] env[59518]: Faults: ['InvalidArgument']
[ 701.638563] env[59518]: ERROR nova.compute.manager [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] Traceback (most recent call last):
[ 701.638563] env[59518]: ERROR nova.compute.manager [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a]   File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 701.638563] env[59518]: ERROR nova.compute.manager [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a]     self.driver.spawn(context, instance, image_meta,
[ 701.638563] env[59518]: ERROR nova.compute.manager [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 701.638563] env[59518]: ERROR nova.compute.manager [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 701.638563] env[59518]: ERROR nova.compute.manager [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 701.638563] env[59518]: ERROR nova.compute.manager [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a]     self._fetch_image_if_missing(context, vi)
[ 701.638563] env[59518]: ERROR nova.compute.manager [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 701.638563] env[59518]: ERROR nova.compute.manager [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a]     image_cache(vi, tmp_image_ds_loc)
[ 701.638563] env[59518]: ERROR nova.compute.manager [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 701.638965] env[59518]: ERROR nova.compute.manager [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a]     vm_util.copy_virtual_disk(
[ 701.638965] env[59518]: ERROR nova.compute.manager [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 701.638965] env[59518]: ERROR nova.compute.manager [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a]     session._wait_for_task(vmdk_copy_task)
[ 701.638965] env[59518]: ERROR nova.compute.manager [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 701.638965] env[59518]: ERROR nova.compute.manager [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a]     return self.wait_for_task(task_ref)
[ 701.638965] env[59518]: ERROR nova.compute.manager [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 701.638965] env[59518]: ERROR nova.compute.manager [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a]     return evt.wait()
[ 701.638965] env[59518]: ERROR nova.compute.manager [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a]   File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 701.638965] env[59518]: ERROR nova.compute.manager [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a]     result = hub.switch()
[ 701.638965] env[59518]: ERROR nova.compute.manager [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a]   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 701.638965] env[59518]: ERROR nova.compute.manager [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a]     return self.greenlet.switch()
[ 701.638965] env[59518]: ERROR nova.compute.manager [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 701.638965] env[59518]: ERROR nova.compute.manager [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a]     self.f(*self.args, **self.kw)
[ 701.639350] env[59518]: ERROR nova.compute.manager [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 701.639350] env[59518]: ERROR nova.compute.manager [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a]     raise exceptions.translate_fault(task_info.error)
[ 701.639350] env[59518]: ERROR nova.compute.manager [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 701.639350] env[59518]: ERROR nova.compute.manager [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] Faults: ['InvalidArgument']
[ 701.639350] env[59518]: ERROR nova.compute.manager [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a]
[ 701.639489] env[59518]: DEBUG nova.compute.utils [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] VimFaultException {{(pid=59518) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 701.648516] env[59518]: DEBUG nova.compute.manager [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] Build of instance 21f44d88-a868-4765-95f8-8dbe8eccef7a was re-scheduled: A specified parameter was not correct: fileType
[ 701.648516] env[59518]: Faults: ['InvalidArgument'] {{(pid=59518) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 701.648516] env[59518]: DEBUG nova.compute.manager [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] Unplugging VIFs for instance {{(pid=59518) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 701.648516] env[59518]: DEBUG nova.compute.manager [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59518) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 701.648516] env[59518]: DEBUG nova.compute.manager [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] Deallocating network for instance {{(pid=59518) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 701.648676] env[59518]: DEBUG nova.network.neutron [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] deallocate_for_instance() {{(pid=59518) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}}
[ 702.007297] env[59518]: DEBUG oslo_concurrency.lockutils [None req-fde14fbb-4e46-46c1-87d5-0dab64b86a3a tempest-ServerGroupTestJSON-923375532 tempest-ServerGroupTestJSON-923375532-project-member] Acquiring lock "68abcc97-7992-474e-873c-ce247f4f1bec" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 702.007767] env[59518]: DEBUG oslo_concurrency.lockutils [None req-fde14fbb-4e46-46c1-87d5-0dab64b86a3a tempest-ServerGroupTestJSON-923375532 tempest-ServerGroupTestJSON-923375532-project-member] Lock "68abcc97-7992-474e-873c-ce247f4f1bec" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 702.184439] env[59518]: DEBUG nova.network.neutron [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] Updating instance_info_cache with network_info: [] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 702.198234] env[59518]: INFO nova.compute.manager [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] Took 0.56 seconds to deallocate network for instance.
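Nearly every record in this log follows the lockutils triple seen above and below: 'Acquiring lock X by Y', then 'acquired :: waited Ns', then '"released" :: held Ns'. A rough sketch of how such wait/hold instrumentation can be produced around a plain threading.Lock (schematic only; timed_lock is our name, and oslo.concurrency's real inner() additionally supports fair, semaphore-backed and external file locks):

import logging
import threading
import time
from contextlib import contextmanager

logging.basicConfig(level=logging.DEBUG, format='%(message)s')
LOG = logging.getLogger(__name__)

@contextmanager
def timed_lock(lock, name, owner):
    # Emit the same acquire/wait/hold bookkeeping as the records above.
    LOG.debug('Acquiring lock "%s" by "%s"', name, owner)
    start = time.monotonic()
    lock.acquire()
    LOG.debug('Lock "%s" acquired by "%s" :: waited %.3fs',
              name, owner, time.monotonic() - start)
    held_from = time.monotonic()
    try:
        yield
    finally:
        lock.release()
        LOG.debug('Lock "%s" "released" by "%s" :: held %.3fs',
                  name, owner, time.monotonic() - held_from)

if __name__ == '__main__':
    compute_resources = threading.Lock()
    with timed_lock(compute_resources, 'compute_resources',
                    'ResourceTracker.instance_claim'):
        time.sleep(0.05)  # stand-in for the claim bookkeeping

The waited/held figures make contention visible directly in the log; the 101.332s hold on the instance lock a few records below is readable at a glance for exactly this reason.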
[ 702.293450] env[59518]: INFO nova.scheduler.client.report [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Deleted allocations for instance 21f44d88-a868-4765-95f8-8dbe8eccef7a [ 702.313639] env[59518]: DEBUG oslo_concurrency.lockutils [None req-8cd86199-41c8-42b2-8ff7-625720d66d62 tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Lock "21f44d88-a868-4765-95f8-8dbe8eccef7a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 101.332s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 702.314889] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "21f44d88-a868-4765-95f8-8dbe8eccef7a" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 91.716s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 702.315067] env[59518]: INFO nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: 21f44d88-a868-4765-95f8-8dbe8eccef7a] During sync_power_state the instance has a pending task (spawning). Skip. [ 702.315284] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "21f44d88-a868-4765-95f8-8dbe8eccef7a" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 702.358192] env[59518]: DEBUG nova.compute.manager [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Starting instance... 
{{(pid=59518) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 702.429616] env[59518]: DEBUG oslo_concurrency.lockutils [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 702.429859] env[59518]: DEBUG oslo_concurrency.lockutils [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 702.431547] env[59518]: INFO nova.compute.claims [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 702.516661] env[59518]: DEBUG oslo_concurrency.lockutils [None req-b0269c41-6fb0-4392-b1b8-cc70fdaf886c tempest-ServersTestBootFromVolume-813210871 tempest-ServersTestBootFromVolume-813210871-project-member] Acquiring lock "00a7659b-41f1-4224-a111-01670979c415" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 702.516661] env[59518]: DEBUG oslo_concurrency.lockutils [None req-b0269c41-6fb0-4392-b1b8-cc70fdaf886c tempest-ServersTestBootFromVolume-813210871 tempest-ServersTestBootFromVolume-813210871-project-member] Lock "00a7659b-41f1-4224-a111-01670979c415" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 702.735255] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-19735e73-549d-4e9e-9b7e-c54d2691b4b4 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 702.743726] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3267130-6852-4657-8db1-179dae8d79bf {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 702.774484] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a630f59-19ad-4d6d-ab9c-11a32de2571d {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 702.782839] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e07a6e8-311b-4941-9465-b0862bcc115f {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 702.797478] env[59518]: DEBUG nova.compute.provider_tree [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Inventory has not changed in ProviderTree for provider: ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd {{(pid=59518) 
update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 702.805817] env[59518]: DEBUG nova.scheduler.client.report [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Inventory has not changed for provider ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 174, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59518) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 702.823122] env[59518]: DEBUG oslo_concurrency.lockutils [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.393s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 702.823600] env[59518]: DEBUG nova.compute.manager [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Start building networks asynchronously for instance. {{(pid=59518) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 702.857951] env[59518]: DEBUG nova.compute.utils [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Using /dev/sd instead of None {{(pid=59518) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 702.864343] env[59518]: DEBUG nova.compute.manager [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Allocating IP information in the background. {{(pid=59518) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 702.864540] env[59518]: DEBUG nova.network.neutron [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] allocate_for_instance() {{(pid=59518) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 702.871166] env[59518]: DEBUG nova.compute.manager [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Start building block device mappings for instance. 
{{(pid=59518) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 702.914645] env[59518]: DEBUG nova.policy [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fccdd7de3a1d4db3beb9b23142fb421f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a29a7379624e4b8fb86b25509aae97e0', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59518) authorize /opt/stack/nova/nova/policy.py:203}} [ 702.941241] env[59518]: DEBUG nova.compute.manager [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Start spawning the instance on the hypervisor. {{(pid=59518) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 702.962986] env[59518]: DEBUG nova.virt.hardware [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-05-28T13:15:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-05-28T13:15:21Z,direct_url=,disk_format='vmdk',id=e70539a9-144d-4900-807e-914ae0cc8539,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='33bab75ff2cf45ecb4ab54af3adf83ee',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-05-28T13:15:22Z,virtual_size=,visibility=), allow threads: False {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 702.963205] env[59518]: DEBUG nova.virt.hardware [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Flavor limits 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 702.963349] env[59518]: DEBUG nova.virt.hardware [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Image limits 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 702.963518] env[59518]: DEBUG nova.virt.hardware [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Flavor pref 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 702.963652] env[59518]: DEBUG nova.virt.hardware [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Image pref 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 702.963785] env[59518]: 
DEBUG nova.virt.hardware [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 702.963986] env[59518]: DEBUG nova.virt.hardware [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 702.964163] env[59518]: DEBUG nova.virt.hardware [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59518) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 702.964306] env[59518]: DEBUG nova.virt.hardware [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Got 1 possible topologies {{(pid=59518) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 702.964456] env[59518]: DEBUG nova.virt.hardware [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 702.964615] env[59518]: DEBUG nova.virt.hardware [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 702.965428] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-10b96b4e-5622-4111-a063-9ba351642463 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 702.973629] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6db2b96-0949-4bea-ba6b-7969bf8aff0d {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 703.280217] env[59518]: DEBUG nova.network.neutron [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Successfully created port: 3696b90d-bd4e-4908-894c-ac48bb0131dd {{(pid=59518) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 703.840598] env[59518]: DEBUG nova.network.neutron [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Successfully updated port: 3696b90d-bd4e-4908-894c-ac48bb0131dd {{(pid=59518) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 703.851263] env[59518]: DEBUG oslo_concurrency.lockutils [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 
tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Acquiring lock "refresh_cache-3e59b5d7-978d-405a-b68a-47ee03b9a713" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 703.851263] env[59518]: DEBUG oslo_concurrency.lockutils [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Acquired lock "refresh_cache-3e59b5d7-978d-405a-b68a-47ee03b9a713" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 703.851263] env[59518]: DEBUG nova.network.neutron [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Building network info cache for instance {{(pid=59518) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 703.903116] env[59518]: DEBUG nova.network.neutron [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Instance cache missing network info. {{(pid=59518) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 704.065339] env[59518]: DEBUG nova.network.neutron [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Updating instance_info_cache with network_info: [{"id": "3696b90d-bd4e-4908-894c-ac48bb0131dd", "address": "fa:16:3e:a4:cf:7d", "network": {"id": "a22974c0-059e-463e-b50d-81172391a1e0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-120594682-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a29a7379624e4b8fb86b25509aae97e0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "352165bb-004f-4180-9627-3a275dbe18af", "external-id": "nsx-vlan-transportzone-926", "segmentation_id": 926, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3696b90d-bd", "ovs_interfaceid": "3696b90d-bd4e-4908-894c-ac48bb0131dd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 704.075926] env[59518]: DEBUG oslo_concurrency.lockutils [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Releasing lock "refresh_cache-3e59b5d7-978d-405a-b68a-47ee03b9a713" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 704.076257] env[59518]: DEBUG nova.compute.manager [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] 
[instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Instance network_info: |[{"id": "3696b90d-bd4e-4908-894c-ac48bb0131dd", "address": "fa:16:3e:a4:cf:7d", "network": {"id": "a22974c0-059e-463e-b50d-81172391a1e0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-120594682-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a29a7379624e4b8fb86b25509aae97e0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "352165bb-004f-4180-9627-3a275dbe18af", "external-id": "nsx-vlan-transportzone-926", "segmentation_id": 926, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3696b90d-bd", "ovs_interfaceid": "3696b90d-bd4e-4908-894c-ac48bb0131dd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59518) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 704.076637] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:a4:cf:7d', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '352165bb-004f-4180-9627-3a275dbe18af', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '3696b90d-bd4e-4908-894c-ac48bb0131dd', 'vif_model': 'vmxnet3'}] {{(pid=59518) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 704.084342] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Creating folder: Project (a29a7379624e4b8fb86b25509aae97e0). Parent ref: group-v88807. {{(pid=59518) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 704.084835] env[59518]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-83847242-1b98-41b9-b926-aa7c22dd85b0 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 704.098489] env[59518]: INFO nova.virt.vmwareapi.vm_util [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Created folder: Project (a29a7379624e4b8fb86b25509aae97e0) in parent group-v88807. [ 704.098695] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Creating folder: Instances. Parent ref: group-v88844. 
{{(pid=59518) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 704.098955] env[59518]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-2072be1f-dc55-43ab-8eab-27bc0b03e959 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 704.110992] env[59518]: INFO nova.virt.vmwareapi.vm_util [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Created folder: Instances in parent group-v88844. [ 704.111267] env[59518]: DEBUG oslo.service.loopingcall [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59518) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 704.111459] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Creating VM on the ESX host {{(pid=59518) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 704.111656] env[59518]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-9b633012-b2a5-4694-b746-b36ff64a403a {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 704.133669] env[59518]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 704.133669] env[59518]: value = "task-307957" [ 704.133669] env[59518]: _type = "Task" [ 704.133669] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 704.148208] env[59518]: DEBUG oslo_vmware.api [-] Task: {'id': task-307957, 'name': CreateVM_Task} progress is 0%. 
{{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 704.479498] env[59518]: DEBUG nova.compute.manager [req-14057a77-af49-4ecf-bb6f-d33d62332076 req-24628106-ed02-4110-9901-05c5b9f5862b service nova] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Received event network-vif-plugged-3696b90d-bd4e-4908-894c-ac48bb0131dd {{(pid=59518) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 704.480155] env[59518]: DEBUG oslo_concurrency.lockutils [req-14057a77-af49-4ecf-bb6f-d33d62332076 req-24628106-ed02-4110-9901-05c5b9f5862b service nova] Acquiring lock "3e59b5d7-978d-405a-b68a-47ee03b9a713-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 704.480155] env[59518]: DEBUG oslo_concurrency.lockutils [req-14057a77-af49-4ecf-bb6f-d33d62332076 req-24628106-ed02-4110-9901-05c5b9f5862b service nova] Lock "3e59b5d7-978d-405a-b68a-47ee03b9a713-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 704.480155] env[59518]: DEBUG oslo_concurrency.lockutils [req-14057a77-af49-4ecf-bb6f-d33d62332076 req-24628106-ed02-4110-9901-05c5b9f5862b service nova] Lock "3e59b5d7-978d-405a-b68a-47ee03b9a713-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 704.480329] env[59518]: DEBUG nova.compute.manager [req-14057a77-af49-4ecf-bb6f-d33d62332076 req-24628106-ed02-4110-9901-05c5b9f5862b service nova] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] No waiting events found dispatching network-vif-plugged-3696b90d-bd4e-4908-894c-ac48bb0131dd {{(pid=59518) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 704.480888] env[59518]: WARNING nova.compute.manager [req-14057a77-af49-4ecf-bb6f-d33d62332076 req-24628106-ed02-4110-9901-05c5b9f5862b service nova] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Received unexpected event network-vif-plugged-3696b90d-bd4e-4908-894c-ac48bb0131dd for instance with vm_state building and task_state spawning. [ 704.480888] env[59518]: DEBUG nova.compute.manager [req-14057a77-af49-4ecf-bb6f-d33d62332076 req-24628106-ed02-4110-9901-05c5b9f5862b service nova] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Received event network-changed-3696b90d-bd4e-4908-894c-ac48bb0131dd {{(pid=59518) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 704.480888] env[59518]: DEBUG nova.compute.manager [req-14057a77-af49-4ecf-bb6f-d33d62332076 req-24628106-ed02-4110-9901-05c5b9f5862b service nova] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Refreshing instance network info cache due to event network-changed-3696b90d-bd4e-4908-894c-ac48bb0131dd. 
{{(pid=59518) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 704.480888] env[59518]: DEBUG oslo_concurrency.lockutils [req-14057a77-af49-4ecf-bb6f-d33d62332076 req-24628106-ed02-4110-9901-05c5b9f5862b service nova] Acquiring lock "refresh_cache-3e59b5d7-978d-405a-b68a-47ee03b9a713" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 704.481308] env[59518]: DEBUG oslo_concurrency.lockutils [req-14057a77-af49-4ecf-bb6f-d33d62332076 req-24628106-ed02-4110-9901-05c5b9f5862b service nova] Acquired lock "refresh_cache-3e59b5d7-978d-405a-b68a-47ee03b9a713" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 704.481308] env[59518]: DEBUG nova.network.neutron [req-14057a77-af49-4ecf-bb6f-d33d62332076 req-24628106-ed02-4110-9901-05c5b9f5862b service nova] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Refreshing network info cache for port 3696b90d-bd4e-4908-894c-ac48bb0131dd {{(pid=59518) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 704.585917] env[59518]: DEBUG oslo_concurrency.lockutils [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Acquiring lock "7d4fa130-c399-4e8c-a711-33a08ed5dde9" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 704.586179] env[59518]: DEBUG oslo_concurrency.lockutils [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Lock "7d4fa130-c399-4e8c-a711-33a08ed5dde9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 704.646612] env[59518]: DEBUG oslo_vmware.api [-] Task: {'id': task-307957, 'name': CreateVM_Task, 'duration_secs': 0.332898} completed successfully. 
{{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 704.646782] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Created VM on the ESX host {{(pid=59518) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 704.647477] env[59518]: DEBUG oslo_concurrency.lockutils [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 704.647646] env[59518]: DEBUG oslo_concurrency.lockutils [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 704.647928] env[59518]: DEBUG oslo_concurrency.lockutils [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 704.648178] env[59518]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3fef0837-7d80-4811-a7fd-8ebe7c5b16af {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 704.653344] env[59518]: DEBUG oslo_vmware.api [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Waiting for the task: (returnval){ [ 704.653344] env[59518]: value = "session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]529a6b8d-a6ec-7ef6-2de4-da06df7c12fb" [ 704.653344] env[59518]: _type = "Task" [ 704.653344] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 704.661230] env[59518]: DEBUG oslo_vmware.api [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Task: {'id': session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]529a6b8d-a6ec-7ef6-2de4-da06df7c12fb, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 704.744564] env[59518]: DEBUG nova.network.neutron [req-14057a77-af49-4ecf-bb6f-d33d62332076 req-24628106-ed02-4110-9901-05c5b9f5862b service nova] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Updated VIF entry in instance network info cache for port 3696b90d-bd4e-4908-894c-ac48bb0131dd. 
{{(pid=59518) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 704.744914] env[59518]: DEBUG nova.network.neutron [req-14057a77-af49-4ecf-bb6f-d33d62332076 req-24628106-ed02-4110-9901-05c5b9f5862b service nova] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Updating instance_info_cache with network_info: [{"id": "3696b90d-bd4e-4908-894c-ac48bb0131dd", "address": "fa:16:3e:a4:cf:7d", "network": {"id": "a22974c0-059e-463e-b50d-81172391a1e0", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-120594682-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a29a7379624e4b8fb86b25509aae97e0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "352165bb-004f-4180-9627-3a275dbe18af", "external-id": "nsx-vlan-transportzone-926", "segmentation_id": 926, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3696b90d-bd", "ovs_interfaceid": "3696b90d-bd4e-4908-894c-ac48bb0131dd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 704.754263] env[59518]: DEBUG oslo_concurrency.lockutils [req-14057a77-af49-4ecf-bb6f-d33d62332076 req-24628106-ed02-4110-9901-05c5b9f5862b service nova] Releasing lock "refresh_cache-3e59b5d7-978d-405a-b68a-47ee03b9a713" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 705.164285] env[59518]: DEBUG oslo_concurrency.lockutils [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 705.164553] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Processing image e70539a9-144d-4900-807e-914ae0cc8539 {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 705.164724] env[59518]: DEBUG oslo_concurrency.lockutils [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 716.038208] env[59518]: DEBUG oslo_concurrency.lockutils [None req-5ceb658d-bff5-410e-817f-25ac83db0abf tempest-ServerActionsV293TestJSON-855670052 tempest-ServerActionsV293TestJSON-855670052-project-member] Acquiring lock "2277f51e-d169-416c-b86b-fb8a019a309d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59518) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 716.039008] env[59518]: DEBUG oslo_concurrency.lockutils [None req-5ceb658d-bff5-410e-817f-25ac83db0abf tempest-ServerActionsV293TestJSON-855670052 tempest-ServerActionsV293TestJSON-855670052-project-member] Lock "2277f51e-d169-416c-b86b-fb8a019a309d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 735.965155] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 736.448542] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 736.448542] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 737.444234] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 737.448040] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 737.448040] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Starting heal instance info cache {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9808}} [ 737.448165] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Rebuilding the list of instances to heal {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}} [ 737.468999] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 737.469177] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 737.469307] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] Skipping network cache update for instance because it is Building. 
{{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 737.469431] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 737.469553] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: b4fb287e-7329-4002-911a-2d1eee138372] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 737.469737] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 737.470001] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: eead3c41-1a63-48f7-941e-24470658ed13] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 737.470145] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 737.470263] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: 282b61db-76cd-44c3-b500-7a465e903c97] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 737.470378] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 737.470494] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Didn't find any instances for network info cache update. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9894}} [ 737.471093] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 737.471256] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 737.471425] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 737.471629] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=59518) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10427}}
[ 738.448459] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 738.458729] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 738.459027] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 738.459112] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 738.459264] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59518) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}}
[ 738.460421] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-750a841a-28b2-467e-9b7f-5972770f1026 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 738.469296] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-83783798-7fe3-49fc-adf7-89a2f20dc027 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 738.484670] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bda1108a-127e-4dd8-ba1c-0f4e7c1db229 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 738.490895] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9001529d-43eb-4f5b-a4cf-b68278e168d2 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 738.519567] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181769MB free_disk=174GB free_vcpus=48 pci_devices=None {{(pid=59518) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}}
[ 738.519654] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 738.519842] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 738.583885] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 738.584056] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 6ffd468f-b92f-45ae-834f-6daac20937ef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 738.584182] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 37d863b9-bfcb-4d1f-b99b-832276bd640f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 738.584298] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance a894a8af-52b8-4b1c-a5ea-2469f06ea17a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 738.584419] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance b4fb287e-7329-4002-911a-2d1eee138372 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 738.584533] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance a85abea1-8e8d-4007-803d-e36fff55e587 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 738.584643] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance eead3c41-1a63-48f7-941e-24470658ed13 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 738.584751] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 04a58b0b-dfd8-4227-9c10-a69225fa5a53 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 738.584857] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 282b61db-76cd-44c3-b500-7a465e903c97 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 738.584964] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 3e59b5d7-978d-405a-b68a-47ee03b9a713 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 738.596550] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance af4b8dd9-a05d-427e-a147-76c7cfec5862 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 738.606186] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance ae88d565-bbf5-4c29-aee9-364c23086de5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 738.615346] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance ac3485ac-4817-4492-a196-331002b2cc46 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 738.625396] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance ad83492d-05a6-428d-b343-740c977105f8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 738.634592] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance c4eb8b6b-c958-4d21-9e6c-c1ba6b62c619 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 738.643942] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 2aaaecaa-86c6-4d8c-89a8-7d8e2405d294 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 738.653621] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 3bdabf32-3735-4670-8591-fad410629d95 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 738.662636] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 68abcc97-7992-474e-873c-ce247f4f1bec has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 738.671514] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 00a7659b-41f1-4224-a111-01670979c415 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 738.680545] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 7d4fa130-c399-4e8c-a711-33a08ed5dde9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 738.688729] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 2277f51e-d169-416c-b86b-fb8a019a309d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 738.689003] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=59518) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}}
[ 738.689157] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=59518) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}}
[ 738.911594] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-891f53e1-4da5-4db0-a764-655d8a8e89ab {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 738.918769] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9812dd98-4b57-4ad2-bd79-caae704b168c {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 738.948982] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a15dedfd-c4e9-43c1-b884-c1ccf18b5224 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 738.956190] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-68080f14-11b3-4771-9ff6-da9a48ed6283 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 738.969745] env[59518]: DEBUG nova.compute.provider_tree [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Inventory has not changed in ProviderTree for provider: ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd {{(pid=59518) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 738.978240] env[59518]: DEBUG nova.scheduler.client.report [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Inventory has not changed for provider ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 174, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59518) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 738.990951] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59518) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}}
[ 738.991125] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.471s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 750.011003] env[59518]: WARNING oslo_vmware.rw_handles [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 750.011003] env[59518]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 750.011003] env[59518]: ERROR oslo_vmware.rw_handles   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 750.011003] env[59518]: ERROR oslo_vmware.rw_handles     self._conn.getresponse()
[ 750.011003] env[59518]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 750.011003] env[59518]: ERROR oslo_vmware.rw_handles     response.begin()
[ 750.011003] env[59518]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 750.011003] env[59518]: ERROR oslo_vmware.rw_handles     version, status, reason = self._read_status()
[ 750.011003] env[59518]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 750.011003] env[59518]: ERROR oslo_vmware.rw_handles     raise RemoteDisconnected("Remote end closed connection without"
[ 750.011003] env[59518]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 750.011003] env[59518]: ERROR oslo_vmware.rw_handles
[ 750.011587] env[59518]: DEBUG nova.virt.vmwareapi.images [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70] Downloaded image file data e70539a9-144d-4900-807e-914ae0cc8539 to vmware_temp/d84030d6-44c4-42a5-8590-f826480a6e80/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk on the data store datastore1 {{(pid=59518) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 750.013021] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70] Caching image {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 750.013300] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Copying Virtual Disk [datastore1] vmware_temp/d84030d6-44c4-42a5-8590-f826480a6e80/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk to [datastore1] vmware_temp/d84030d6-44c4-42a5-8590-f826480a6e80/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk {{(pid=59518) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 750.013601] env[59518]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-b62754c3-9848-4765-b056-402806b7e31e {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 750.022418] env[59518]: DEBUG oslo_vmware.api [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Waiting for the task: (returnval){
[ 750.022418] env[59518]: value = "task-307962"
[ 750.022418] env[59518]: _type = "Task"
[ 750.022418] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 750.030838] env[59518]: DEBUG oslo_vmware.api [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Task: {'id': task-307962, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 750.532724] env[59518]: DEBUG oslo_vmware.exceptions [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Fault InvalidArgument not matched. {{(pid=59518) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}}
[ 750.533130] env[59518]: DEBUG oslo_concurrency.lockutils [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 750.533801] env[59518]: ERROR nova.compute.manager [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 750.533801] env[59518]: Faults: ['InvalidArgument']
[ 750.533801] env[59518]: ERROR nova.compute.manager [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70] Traceback (most recent call last):
[ 750.533801] env[59518]: ERROR nova.compute.manager [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70]   File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 750.533801] env[59518]: ERROR nova.compute.manager [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70]     yield resources
[ 750.533801] env[59518]: ERROR nova.compute.manager [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70]   File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 750.533801] env[59518]: ERROR nova.compute.manager [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70]     self.driver.spawn(context, instance, image_meta,
[ 750.533801] env[59518]: ERROR nova.compute.manager [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 750.533801] env[59518]: ERROR nova.compute.manager [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 750.533801] env[59518]: ERROR nova.compute.manager [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 750.533801] env[59518]: ERROR nova.compute.manager [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70]     self._fetch_image_if_missing(context, vi)
[ 750.533801] env[59518]: ERROR nova.compute.manager [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 750.533801] env[59518]: ERROR nova.compute.manager [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70]     image_cache(vi, tmp_image_ds_loc)
[ 750.534265] env[59518]: ERROR nova.compute.manager [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 750.534265] env[59518]: ERROR nova.compute.manager [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70]     vm_util.copy_virtual_disk(
[ 750.534265] env[59518]: ERROR nova.compute.manager [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 750.534265] env[59518]: ERROR nova.compute.manager [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70]     session._wait_for_task(vmdk_copy_task)
[ 750.534265] env[59518]: ERROR nova.compute.manager [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 750.534265] env[59518]: ERROR nova.compute.manager [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70]     return self.wait_for_task(task_ref)
[ 750.534265] env[59518]: ERROR nova.compute.manager [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 750.534265] env[59518]: ERROR nova.compute.manager [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70]     return evt.wait()
[ 750.534265] env[59518]: ERROR nova.compute.manager [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70]   File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 750.534265] env[59518]: ERROR nova.compute.manager [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70]     result = hub.switch()
[ 750.534265] env[59518]: ERROR nova.compute.manager [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70]   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 750.534265] env[59518]: ERROR nova.compute.manager [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70]     return self.greenlet.switch()
[ 750.534265] env[59518]: ERROR nova.compute.manager [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 750.534590] env[59518]: ERROR nova.compute.manager [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70]     self.f(*self.args, **self.kw)
[ 750.534590] env[59518]: ERROR nova.compute.manager [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 750.534590] env[59518]: ERROR nova.compute.manager [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70]     raise exceptions.translate_fault(task_info.error)
[ 750.534590] env[59518]: ERROR nova.compute.manager [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 750.534590] env[59518]: ERROR nova.compute.manager [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70] Faults: ['InvalidArgument']
[ 750.534590] env[59518]: ERROR nova.compute.manager [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70]
[ 750.534740] env[59518]: INFO nova.compute.manager [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70] Terminating instance
[ 750.536331] env[59518]: DEBUG oslo_concurrency.lockutils [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 750.536415] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 750.537021] env[59518]: DEBUG nova.compute.manager [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70] Start destroying the instance on the hypervisor. {{(pid=59518) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 750.537280] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70] Destroying instance {{(pid=59518) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 750.537496] env[59518]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b9dc64d4-1c63-4457-ba5b-e69642bbc140 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 750.539902] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-230636b9-5379-4fcd-9791-61562d70fb38 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 750.546478] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70] Unregistering the VM {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 750.546684] env[59518]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-47528e6e-959f-4ffb-bc0a-d36788e6346f {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 750.549046] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 750.549218] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=59518) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 750.549856] env[59518]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c4296c13-8624-4a01-9e24-e691ba992672 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 750.554519] env[59518]: DEBUG oslo_vmware.api [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Waiting for the task: (returnval){
[ 750.554519] env[59518]: value = "session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]52d8e409-ba77-c38e-f38c-c6bc3a013e76"
[ 750.554519] env[59518]: _type = "Task"
[ 750.554519] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 750.564609] env[59518]: DEBUG oslo_vmware.api [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Task: {'id': session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]52d8e409-ba77-c38e-f38c-c6bc3a013e76, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 750.637738] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70] Unregistered the VM {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 750.637952] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70] Deleting contents of the VM from datastore datastore1 {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 750.638134] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Deleting the datastore file [datastore1] bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70 {{(pid=59518) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 750.638393] env[59518]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-b3426a2d-694b-4386-a816-e7360e6040c1 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 750.644748] env[59518]: DEBUG oslo_vmware.api [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Waiting for the task: (returnval){
[ 750.644748] env[59518]: value = "task-307964"
[ 750.644748] env[59518]: _type = "Task"
[ 750.644748] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 750.653251] env[59518]: DEBUG oslo_vmware.api [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Task: {'id': task-307964, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 751.065411] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] Preparing fetch location {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 751.065705] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Creating directory with path [datastore1] vmware_temp/e92d4e86-684f-441d-81d4-eb3f3c696b83/e70539a9-144d-4900-807e-914ae0cc8539 {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 751.065898] env[59518]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3ad942b6-86a4-41f9-8773-afe1fd7d0c5d {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 751.080136] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Created directory with path [datastore1] vmware_temp/e92d4e86-684f-441d-81d4-eb3f3c696b83/e70539a9-144d-4900-807e-914ae0cc8539 {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 751.080257] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] Fetch image to [datastore1] vmware_temp/e92d4e86-684f-441d-81d4-eb3f3c696b83/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 751.080391] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] Downloading image file data e70539a9-144d-4900-807e-914ae0cc8539 to [datastore1] vmware_temp/e92d4e86-684f-441d-81d4-eb3f3c696b83/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk on the data store datastore1 {{(pid=59518) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 751.081185] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5257a711-75df-49a7-8c8f-161461c0e775 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 751.088748] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d4ea256-62c3-4ea4-bfa4-090bf6053c9c {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 751.098523] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e90bb05b-68c9-4339-a6fe-9051461c56c5 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 751.128492] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c34a6c7-137e-465d-8267-d7babda3d17b {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 751.134405] env[59518]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-e8b842e5-8bf1-4d0b-8323-920badfefbbe {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 751.153229] env[59518]: DEBUG oslo_vmware.api [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Task: {'id': task-307964, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.1321} completed successfully. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 751.153457] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Deleted the datastore file {{(pid=59518) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 751.153630] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70] Deleted contents of the VM from datastore datastore1 {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 751.153791] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70] Instance destroyed {{(pid=59518) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 751.153954] env[59518]: INFO nova.compute.manager [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70] Took 0.62 seconds to destroy the instance on the hypervisor.
[ 751.156014] env[59518]: DEBUG nova.compute.claims [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70] Aborting claim: {{(pid=59518) abort /opt/stack/nova/nova/compute/claims.py:84}}
[ 751.156184] env[59518]: DEBUG oslo_concurrency.lockutils [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 751.156389] env[59518]: DEBUG oslo_concurrency.lockutils [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 751.162940] env[59518]: DEBUG nova.virt.vmwareapi.images [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] Downloading image file data e70539a9-144d-4900-807e-914ae0cc8539 to the data store datastore1 {{(pid=59518) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 751.212154] env[59518]: DEBUG oslo_vmware.rw_handles [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e92d4e86-684f-441d-81d4-eb3f3c696b83/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59518) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}}
[ 751.269370] env[59518]: DEBUG oslo_vmware.rw_handles [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Completed reading data from the image iterator. {{(pid=59518) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}}
[ 751.269541] env[59518]: DEBUG oslo_vmware.rw_handles [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e92d4e86-684f-441d-81d4-eb3f3c696b83/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59518) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}}
[ 751.475603] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-30cce1ab-e6e4-4aad-bd41-6bf9cc864b4d {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 751.481643] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f252b9e2-90d6-4093-959e-923f92f37d1b {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 751.511449] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b17ba9fd-be7f-4818-910b-22a906c5693e {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 751.517990] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-672e4b4e-d5cb-4bf0-a0cf-66e656e9f18a {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 751.530202] env[59518]: DEBUG nova.compute.provider_tree [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Inventory has not changed in ProviderTree for provider: ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd {{(pid=59518) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 751.538337] env[59518]: DEBUG nova.scheduler.client.report [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Inventory has not changed for provider ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 174, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59518) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 751.554114] env[59518]: DEBUG oslo_concurrency.lockutils [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.398s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 751.554626] env[59518]: ERROR nova.compute.manager [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 751.554626] env[59518]: Faults: ['InvalidArgument']
[ 751.554626] env[59518]: ERROR nova.compute.manager [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70] Traceback (most recent call last):
[ 751.554626] env[59518]: ERROR nova.compute.manager [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70]   File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 751.554626] env[59518]: ERROR nova.compute.manager [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70]     self.driver.spawn(context, instance, image_meta,
[ 751.554626] env[59518]: ERROR nova.compute.manager [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 751.554626] env[59518]: ERROR nova.compute.manager [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 751.554626] env[59518]: ERROR nova.compute.manager [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 751.554626] env[59518]: ERROR nova.compute.manager [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70]     self._fetch_image_if_missing(context, vi)
[ 751.554626] env[59518]: ERROR nova.compute.manager [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 751.554626] env[59518]: ERROR nova.compute.manager [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70]     image_cache(vi, tmp_image_ds_loc)
[ 751.554626] env[59518]: ERROR nova.compute.manager [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 751.554955] env[59518]: ERROR nova.compute.manager [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70]     vm_util.copy_virtual_disk(
[ 751.554955] env[59518]: ERROR nova.compute.manager [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 751.554955] env[59518]: ERROR nova.compute.manager [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70]     session._wait_for_task(vmdk_copy_task)
[ 751.554955] env[59518]: ERROR nova.compute.manager [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 751.554955] env[59518]: ERROR nova.compute.manager [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70]     return self.wait_for_task(task_ref)
[ 751.554955] env[59518]: ERROR nova.compute.manager [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 751.554955] env[59518]: ERROR nova.compute.manager [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70]     return evt.wait()
[ 751.554955] env[59518]: ERROR nova.compute.manager [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70]   File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 751.554955] env[59518]: ERROR nova.compute.manager [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70]     result = hub.switch()
[ 751.554955] env[59518]: ERROR nova.compute.manager [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70]   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 751.554955] env[59518]: ERROR nova.compute.manager [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70]     return self.greenlet.switch()
[ 751.554955] env[59518]: ERROR nova.compute.manager [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 751.554955] env[59518]: ERROR nova.compute.manager [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70]     self.f(*self.args, **self.kw)
[ 751.555260] env[59518]: ERROR nova.compute.manager [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 751.555260] env[59518]: ERROR nova.compute.manager [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70]     raise exceptions.translate_fault(task_info.error)
[ 751.555260] env[59518]: ERROR nova.compute.manager [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 751.555260] env[59518]: ERROR nova.compute.manager [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70] Faults: ['InvalidArgument']
[ 751.555260] env[59518]: ERROR nova.compute.manager [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70]
[ 751.555385] env[59518]: DEBUG nova.compute.utils [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70] VimFaultException {{(pid=59518) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 751.556590] env[59518]: DEBUG nova.compute.manager [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70] Build of instance bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70 was re-scheduled: A specified parameter was not correct: fileType
[ 751.556590] env[59518]: Faults: ['InvalidArgument'] {{(pid=59518) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 751.556946] env[59518]: DEBUG nova.compute.manager [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70] Unplugging VIFs for instance {{(pid=59518) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 751.557111] env[59518]: DEBUG nova.compute.manager [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59518) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 751.557272] env[59518]: DEBUG nova.compute.manager [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70] Deallocating network for instance {{(pid=59518) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 751.557427] env[59518]: DEBUG nova.network.neutron [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70] deallocate_for_instance() {{(pid=59518) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}}
[ 751.768587] env[59518]: DEBUG nova.network.neutron [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70] Updating instance_info_cache with network_info: [] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 751.779578] env[59518]: INFO nova.compute.manager [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70] Took 0.22 seconds to deallocate network for instance.
[ 751.868489] env[59518]: INFO nova.scheduler.client.report [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Deleted allocations for instance bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70
[ 751.884534] env[59518]: DEBUG oslo_concurrency.lockutils [None req-dff33d41-ce17-4764-86a6-7e27db52e435 tempest-ImagesTestJSON-583231582 tempest-ImagesTestJSON-583231582-project-member] Lock "bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 151.378s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 751.885534] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 141.287s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 751.885715] env[59518]: INFO nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70] During sync_power_state the instance has a pending task (spawning). Skip.
[ 751.885877] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "bbb40cbb-49f1-4d5b-b3bd-8b15a9506a70" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 751.899545] env[59518]: DEBUG nova.compute.manager [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Starting instance... {{(pid=59518) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 751.948416] env[59518]: DEBUG oslo_concurrency.lockutils [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 751.948665] env[59518]: DEBUG oslo_concurrency.lockutils [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 751.950246] env[59518]: INFO nova.compute.claims [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 752.238634] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8672cc26-fa84-469b-a7d2-9e425c368515 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 752.246113] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e92499b-2d94-4606-9c18-ccc2136b6edc {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 752.276823] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b42ce42c-234b-4316-ba2b-ed4eace9d2de {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 752.283820] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a1c09e75-8c2a-4eba-a212-7c9d3214eb71 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 752.296770] env[59518]: DEBUG nova.compute.provider_tree [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Inventory has not changed in ProviderTree for provider: ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd {{(pid=59518) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 752.306831] env[59518]: DEBUG nova.scheduler.client.report [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Inventory has not changed for provider ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 174, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59518) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 752.320580] env[59518]: DEBUG oslo_concurrency.lockutils [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.372s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 752.321084] env[59518]: DEBUG nova.compute.manager [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Start building networks asynchronously for instance. {{(pid=59518) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 752.355218] env[59518]: DEBUG nova.compute.utils [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Using /dev/sd instead of None {{(pid=59518) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 752.356766] env[59518]: DEBUG nova.compute.manager [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Allocating IP information in the background. {{(pid=59518) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 752.356932] env[59518]: DEBUG nova.network.neutron [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] allocate_for_instance() {{(pid=59518) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 752.366834] env[59518]: DEBUG nova.compute.manager [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Start building block device mappings for instance. {{(pid=59518) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 752.416749] env[59518]: DEBUG nova.policy [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '528412e14dd947649b38cc0e7fb925ba', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c2f64e1cb8f34bb3be2a43fadfa84711', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59518) authorize /opt/stack/nova/nova/policy.py:203}}
[ 752.428541] env[59518]: DEBUG nova.compute.manager [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Start spawning the instance on the hypervisor. {{(pid=59518) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 752.449173] env[59518]: DEBUG nova.virt.hardware [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-05-28T13:15:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-05-28T13:15:21Z,direct_url=<?>,disk_format='vmdk',id=e70539a9-144d-4900-807e-914ae0cc8539,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='33bab75ff2cf45ecb4ab54af3adf83ee',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2024-05-28T13:15:22Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 752.449442] env[59518]: DEBUG nova.virt.hardware [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Flavor limits 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 752.449598] env[59518]: DEBUG nova.virt.hardware [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Image limits 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 752.449773] env[59518]: DEBUG nova.virt.hardware [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Flavor pref 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 752.449974] env[59518]: DEBUG nova.virt.hardware [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Image pref 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 752.450045] env[59518]: DEBUG nova.virt.hardware [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 752.450242] env[59518]: DEBUG nova.virt.hardware [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 752.450394] env[59518]: DEBUG nova.virt.hardware [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59518) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 752.450549] env[59518]: DEBUG nova.virt.hardware [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Got 1 possible topologies {{(pid=59518) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 752.450714] env[59518]: DEBUG nova.virt.hardware [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 752.450943] env[59518]: DEBUG nova.virt.hardware [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 752.451777] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5263c185-a968-45d1-bc84-99bab785c43d {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 752.459403] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09051b42-db63-40fb-aeb6-35ac674c16e6 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 752.663172] env[59518]: DEBUG nova.network.neutron [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Successfully created port: daf5e5f5-fe62-4577-a751-7466e4a51af0 {{(pid=59518) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 753.171461] env[59518]: DEBUG nova.network.neutron [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Successfully updated port: daf5e5f5-fe62-4577-a751-7466e4a51af0 {{(pid=59518) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 753.181998] env[59518]: DEBUG oslo_concurrency.lockutils [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Acquiring lock "refresh_cache-af4b8dd9-a05d-427e-a147-76c7cfec5862" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 753.182136] env[59518]: DEBUG oslo_concurrency.lockutils [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Acquired lock "refresh_cache-af4b8dd9-a05d-427e-a147-76c7cfec5862" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 753.182283] env[59518]: DEBUG nova.network.neutron [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Building network info cache for instance {{(pid=59518) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}}
[ 753.215049] env[59518]: DEBUG nova.network.neutron [None req-32530df9-5803-4d52-bb86-e7d79fa607f1
tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Instance cache missing network info. {{(pid=59518) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 753.433192] env[59518]: DEBUG nova.network.neutron [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Updating instance_info_cache with network_info: [{"id": "daf5e5f5-fe62-4577-a751-7466e4a51af0", "address": "fa:16:3e:ad:80:93", "network": {"id": "ef32b5c4-c820-41e1-b466-7b7ea60dced0", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-509293293-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c2f64e1cb8f34bb3be2a43fadfa84711", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "72781990-3cb3-42eb-9eb1-4040dedbf66f", "external-id": "cl2-zone-812", "segmentation_id": 812, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapdaf5e5f5-fe", "ovs_interfaceid": "daf5e5f5-fe62-4577-a751-7466e4a51af0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 753.449184] env[59518]: DEBUG oslo_concurrency.lockutils [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Releasing lock "refresh_cache-af4b8dd9-a05d-427e-a147-76c7cfec5862" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 753.449184] env[59518]: DEBUG nova.compute.manager [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Instance network_info: |[{"id": "daf5e5f5-fe62-4577-a751-7466e4a51af0", "address": "fa:16:3e:ad:80:93", "network": {"id": "ef32b5c4-c820-41e1-b466-7b7ea60dced0", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-509293293-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c2f64e1cb8f34bb3be2a43fadfa84711", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "72781990-3cb3-42eb-9eb1-4040dedbf66f", "external-id": "cl2-zone-812", "segmentation_id": 812, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapdaf5e5f5-fe", "ovs_interfaceid": "daf5e5f5-fe62-4577-a751-7466e4a51af0", "qbh_params": null, "qbg_params": null, "active": 
true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59518) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 753.449300] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:ad:80:93', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '72781990-3cb3-42eb-9eb1-4040dedbf66f', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'daf5e5f5-fe62-4577-a751-7466e4a51af0', 'vif_model': 'vmxnet3'}] {{(pid=59518) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 753.452123] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Creating folder: Project (c2f64e1cb8f34bb3be2a43fadfa84711). Parent ref: group-v88807. {{(pid=59518) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 753.453176] env[59518]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-757b5350-69ce-4351-9a5b-f8bba5814ac8 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 753.466343] env[59518]: INFO nova.virt.vmwareapi.vm_util [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Created folder: Project (c2f64e1cb8f34bb3be2a43fadfa84711) in parent group-v88807. [ 753.466688] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Creating folder: Instances. Parent ref: group-v88848. {{(pid=59518) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 753.467016] env[59518]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c33a76e0-7a32-43ad-b657-02c8c456b969 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 753.478555] env[59518]: INFO nova.virt.vmwareapi.vm_util [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Created folder: Instances in parent group-v88848. [ 753.478923] env[59518]: DEBUG oslo.service.loopingcall [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=59518) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 753.479223] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Creating VM on the ESX host {{(pid=59518) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 753.479548] env[59518]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-81650717-7c96-4753-9fc9-f93c6696773a {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 753.500064] env[59518]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 753.500064] env[59518]: value = "task-307967" [ 753.500064] env[59518]: _type = "Task" [ 753.500064] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 753.508366] env[59518]: DEBUG oslo_vmware.api [-] Task: {'id': task-307967, 'name': CreateVM_Task} progress is 0%. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 753.996229] env[59518]: DEBUG nova.compute.manager [req-4be7dc56-44c1-4d91-843f-da98460ef2bf req-83506537-8b36-45fa-9560-3a48df67c310 service nova] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Received event network-vif-plugged-daf5e5f5-fe62-4577-a751-7466e4a51af0 {{(pid=59518) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 753.996464] env[59518]: DEBUG oslo_concurrency.lockutils [req-4be7dc56-44c1-4d91-843f-da98460ef2bf req-83506537-8b36-45fa-9560-3a48df67c310 service nova] Acquiring lock "af4b8dd9-a05d-427e-a147-76c7cfec5862-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 753.996627] env[59518]: DEBUG oslo_concurrency.lockutils [req-4be7dc56-44c1-4d91-843f-da98460ef2bf req-83506537-8b36-45fa-9560-3a48df67c310 service nova] Lock "af4b8dd9-a05d-427e-a147-76c7cfec5862-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 753.996784] env[59518]: DEBUG oslo_concurrency.lockutils [req-4be7dc56-44c1-4d91-843f-da98460ef2bf req-83506537-8b36-45fa-9560-3a48df67c310 service nova] Lock "af4b8dd9-a05d-427e-a147-76c7cfec5862-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 753.996940] env[59518]: DEBUG nova.compute.manager [req-4be7dc56-44c1-4d91-843f-da98460ef2bf req-83506537-8b36-45fa-9560-3a48df67c310 service nova] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] No waiting events found dispatching network-vif-plugged-daf5e5f5-fe62-4577-a751-7466e4a51af0 {{(pid=59518) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 753.997094] env[59518]: WARNING nova.compute.manager [req-4be7dc56-44c1-4d91-843f-da98460ef2bf req-83506537-8b36-45fa-9560-3a48df67c310 service nova] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Received unexpected event network-vif-plugged-daf5e5f5-fe62-4577-a751-7466e4a51af0 for instance with vm_state building and task_state spawning. 
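The paired "Acquiring lock ... / acquired ... waited / released ... held" records above are emitted by oslo.concurrency's lockutils wrappers (lockutils.py:404/409/423 in the trailers). A minimal sketch of the same pattern, assuming only that oslo.concurrency is installed; Nova reaches lockutils through its own helpers, and the function body here is illustrative, not Nova's:

    from oslo_concurrency import lockutils

    # The lock name matches the records above; everything else is a stand-in.
    @lockutils.synchronized('compute_resources')
    def instance_claim():
        # While the internal semaphore is held, no other green thread can
        # touch the resource tracker's accounting -- hence the
        # "waited 0.000s" / "held 0.372s" bookkeeping in the records above.
        pass

    if __name__ == '__main__':
        instance_claim()

With debug logging configured, each call produces exactly the acquire/release pair seen throughout this log.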
[ 753.997239] env[59518]: DEBUG nova.compute.manager [req-4be7dc56-44c1-4d91-843f-da98460ef2bf req-83506537-8b36-45fa-9560-3a48df67c310 service nova] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Received event network-changed-daf5e5f5-fe62-4577-a751-7466e4a51af0 {{(pid=59518) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 753.997381] env[59518]: DEBUG nova.compute.manager [req-4be7dc56-44c1-4d91-843f-da98460ef2bf req-83506537-8b36-45fa-9560-3a48df67c310 service nova] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Refreshing instance network info cache due to event network-changed-daf5e5f5-fe62-4577-a751-7466e4a51af0. {{(pid=59518) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 753.997647] env[59518]: DEBUG oslo_concurrency.lockutils [req-4be7dc56-44c1-4d91-843f-da98460ef2bf req-83506537-8b36-45fa-9560-3a48df67c310 service nova] Acquiring lock "refresh_cache-af4b8dd9-a05d-427e-a147-76c7cfec5862" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 753.997789] env[59518]: DEBUG oslo_concurrency.lockutils [req-4be7dc56-44c1-4d91-843f-da98460ef2bf req-83506537-8b36-45fa-9560-3a48df67c310 service nova] Acquired lock "refresh_cache-af4b8dd9-a05d-427e-a147-76c7cfec5862" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 753.997935] env[59518]: DEBUG nova.network.neutron [req-4be7dc56-44c1-4d91-843f-da98460ef2bf req-83506537-8b36-45fa-9560-3a48df67c310 service nova] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Refreshing network info cache for port daf5e5f5-fe62-4577-a751-7466e4a51af0 {{(pid=59518) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 754.011502] env[59518]: DEBUG oslo_vmware.api [-] Task: {'id': task-307967, 'name': CreateVM_Task, 'duration_secs': 0.291359} completed successfully. 
{{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 754.011919] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Created VM on the ESX host {{(pid=59518) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 754.012547] env[59518]: DEBUG oslo_concurrency.lockutils [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 754.012793] env[59518]: DEBUG oslo_concurrency.lockutils [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 754.013088] env[59518]: DEBUG oslo_concurrency.lockutils [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 754.013317] env[59518]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-06642d8a-6270-4125-b106-4da1a03fb462 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 754.019035] env[59518]: DEBUG oslo_vmware.api [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Waiting for the task: (returnval){ [ 754.019035] env[59518]: value = "session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]5268f368-fca6-8507-f680-31f90f35bfe3" [ 754.019035] env[59518]: _type = "Task" [ 754.019035] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 754.027002] env[59518]: DEBUG oslo_vmware.api [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Task: {'id': session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]5268f368-fca6-8507-f680-31f90f35bfe3, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 754.232948] env[59518]: DEBUG nova.network.neutron [req-4be7dc56-44c1-4d91-843f-da98460ef2bf req-83506537-8b36-45fa-9560-3a48df67c310 service nova] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Updated VIF entry in instance network info cache for port daf5e5f5-fe62-4577-a751-7466e4a51af0. 
{{(pid=59518) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 754.233391] env[59518]: DEBUG nova.network.neutron [req-4be7dc56-44c1-4d91-843f-da98460ef2bf req-83506537-8b36-45fa-9560-3a48df67c310 service nova] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Updating instance_info_cache with network_info: [{"id": "daf5e5f5-fe62-4577-a751-7466e4a51af0", "address": "fa:16:3e:ad:80:93", "network": {"id": "ef32b5c4-c820-41e1-b466-7b7ea60dced0", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-509293293-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c2f64e1cb8f34bb3be2a43fadfa84711", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "72781990-3cb3-42eb-9eb1-4040dedbf66f", "external-id": "cl2-zone-812", "segmentation_id": 812, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapdaf5e5f5-fe", "ovs_interfaceid": "daf5e5f5-fe62-4577-a751-7466e4a51af0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 754.243459] env[59518]: DEBUG oslo_concurrency.lockutils [req-4be7dc56-44c1-4d91-843f-da98460ef2bf req-83506537-8b36-45fa-9560-3a48df67c310 service nova] Releasing lock "refresh_cache-af4b8dd9-a05d-427e-a147-76c7cfec5862" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 754.528731] env[59518]: DEBUG oslo_concurrency.lockutils [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 754.529039] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Processing image e70539a9-144d-4900-807e-914ae0cc8539 {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 754.529219] env[59518]: DEBUG oslo_concurrency.lockutils [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 795.991399] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 796.447886] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 
None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 796.447886] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 797.125167] env[59518]: WARNING oslo_vmware.rw_handles [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 797.125167] env[59518]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 797.125167] env[59518]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 797.125167] env[59518]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 797.125167] env[59518]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 797.125167] env[59518]: ERROR oslo_vmware.rw_handles response.begin() [ 797.125167] env[59518]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 797.125167] env[59518]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 797.125167] env[59518]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 797.125167] env[59518]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 797.125167] env[59518]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 797.125167] env[59518]: ERROR oslo_vmware.rw_handles [ 797.125924] env[59518]: DEBUG nova.virt.vmwareapi.images [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] Downloaded image file data e70539a9-144d-4900-807e-914ae0cc8539 to vmware_temp/e92d4e86-684f-441d-81d4-eb3f3c696b83/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk on the data store datastore1 {{(pid=59518) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 797.127213] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] Caching image {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 797.127476] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Copying Virtual Disk [datastore1] vmware_temp/e92d4e86-684f-441d-81d4-eb3f3c696b83/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk to [datastore1] vmware_temp/e92d4e86-684f-441d-81d4-eb3f3c696b83/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk {{(pid=59518) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 797.127770] env[59518]: DEBUG oslo_vmware.service [-] Invoking 
VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-975908e0-759b-445d-b5f4-0cb9fc665cf6 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 797.136261] env[59518]: DEBUG oslo_vmware.api [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Waiting for the task: (returnval){ [ 797.136261] env[59518]: value = "task-307968" [ 797.136261] env[59518]: _type = "Task" [ 797.136261] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 797.143863] env[59518]: DEBUG oslo_vmware.api [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Task: {'id': task-307968, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 797.443184] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 797.443414] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 797.465689] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 797.465889] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Starting heal instance info cache {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9808}} [ 797.466024] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Rebuilding the list of instances to heal {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}} [ 797.488154] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 797.488244] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 797.488371] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 797.488490] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: b4fb287e-7329-4002-911a-2d1eee138372] Skipping network cache update for instance because it is Building. 
{{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 797.488611] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 797.488727] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: eead3c41-1a63-48f7-941e-24470658ed13] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 797.488917] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 797.489071] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: 282b61db-76cd-44c3-b500-7a465e903c97] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 797.489188] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 797.489300] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 797.489415] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Didn't find any instances for network info cache update. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9894}} [ 797.647572] env[59518]: DEBUG oslo_vmware.exceptions [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Fault InvalidArgument not matched. 
{{(pid=59518) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 797.647922] env[59518]: DEBUG oslo_concurrency.lockutils [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 797.648660] env[59518]: ERROR nova.compute.manager [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 797.648660] env[59518]: Faults: ['InvalidArgument'] [ 797.648660] env[59518]: ERROR nova.compute.manager [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] Traceback (most recent call last): [ 797.648660] env[59518]: ERROR nova.compute.manager [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 797.648660] env[59518]: ERROR nova.compute.manager [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] yield resources [ 797.648660] env[59518]: ERROR nova.compute.manager [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 797.648660] env[59518]: ERROR nova.compute.manager [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] self.driver.spawn(context, instance, image_meta, [ 797.648660] env[59518]: ERROR nova.compute.manager [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 797.648660] env[59518]: ERROR nova.compute.manager [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] self._vmops.spawn(context, instance, image_meta, injected_files, [ 797.648660] env[59518]: ERROR nova.compute.manager [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 797.648660] env[59518]: ERROR nova.compute.manager [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] self._fetch_image_if_missing(context, vi) [ 797.648660] env[59518]: ERROR nova.compute.manager [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 797.649052] env[59518]: ERROR nova.compute.manager [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] image_cache(vi, tmp_image_ds_loc) [ 797.649052] env[59518]: ERROR nova.compute.manager [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 797.649052] env[59518]: ERROR nova.compute.manager [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] vm_util.copy_virtual_disk( [ 797.649052] env[59518]: ERROR nova.compute.manager [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 797.649052] env[59518]: ERROR nova.compute.manager [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] session._wait_for_task(vmdk_copy_task) [ 797.649052] env[59518]: ERROR nova.compute.manager [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 797.649052] env[59518]: ERROR nova.compute.manager [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] return self.wait_for_task(task_ref) [ 797.649052] env[59518]: ERROR nova.compute.manager [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 797.649052] env[59518]: ERROR nova.compute.manager [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] return evt.wait() [ 797.649052] env[59518]: ERROR nova.compute.manager [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 797.649052] env[59518]: ERROR nova.compute.manager [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] result = hub.switch() [ 797.649052] env[59518]: ERROR nova.compute.manager [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 797.649052] env[59518]: ERROR nova.compute.manager [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] return self.greenlet.switch() [ 797.649420] env[59518]: ERROR nova.compute.manager [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 797.649420] env[59518]: ERROR nova.compute.manager [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] self.f(*self.args, **self.kw) [ 797.649420] env[59518]: ERROR nova.compute.manager [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 797.649420] env[59518]: ERROR nova.compute.manager [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] raise exceptions.translate_fault(task_info.error) [ 797.649420] env[59518]: ERROR nova.compute.manager [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 797.649420] env[59518]: ERROR nova.compute.manager [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] Faults: ['InvalidArgument'] [ 797.649420] env[59518]: ERROR nova.compute.manager [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] [ 797.649420] env[59518]: INFO nova.compute.manager [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] Terminating instance [ 797.651095] env[59518]: DEBUG oslo_concurrency.lockutils [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 797.651360] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 797.651640] env[59518]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-217715e9-caae-4221-b898-6bda6d999166 {{(pid=59518) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 797.654128] env[59518]: DEBUG nova.compute.manager [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] Start destroying the instance on the hypervisor. {{(pid=59518) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 797.654373] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] Destroying instance {{(pid=59518) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 797.655106] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a14785a-6d30-4da9-8a63-9823c7fc1cbe {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 797.664715] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] Unregistering the VM {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 797.665741] env[59518]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-6daefc25-fc1b-4594-98a1-7d15a501a1aa {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 797.667156] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 797.667322] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=59518) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 797.667972] env[59518]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-43100bc6-e6ec-41b9-8958-88d1ebc0974a {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 797.673502] env[59518]: DEBUG oslo_vmware.api [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Waiting for the task: (returnval){ [ 797.673502] env[59518]: value = "session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]52ab6e41-6961-489b-a878-76a614bb4d03" [ 797.673502] env[59518]: _type = "Task" [ 797.673502] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 797.680256] env[59518]: DEBUG oslo_vmware.api [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Task: {'id': session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]52ab6e41-6961-489b-a878-76a614bb4d03, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 797.737520] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] Unregistered the VM {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 797.737741] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] Deleting contents of the VM from datastore datastore1 {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 797.737913] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Deleting the datastore file [datastore1] 6ffd468f-b92f-45ae-834f-6daac20937ef {{(pid=59518) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 797.738158] env[59518]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-dc250195-1601-4233-b04d-5725642124b4 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 797.744398] env[59518]: DEBUG oslo_vmware.api [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Waiting for the task: (returnval){ [ 797.744398] env[59518]: value = "task-307970" [ 797.744398] env[59518]: _type = "Task" [ 797.744398] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 797.752779] env[59518]: DEBUG oslo_vmware.api [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Task: {'id': task-307970, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 798.184331] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] Preparing fetch location {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 798.184638] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Creating directory with path [datastore1] vmware_temp/4d0e1f7e-9de8-467f-8d1f-887006e40103/e70539a9-144d-4900-807e-914ae0cc8539 {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 798.184811] env[59518]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3662970f-9cbe-41a4-b459-5c56430db85b {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 798.197221] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Created directory with path [datastore1] vmware_temp/4d0e1f7e-9de8-467f-8d1f-887006e40103/e70539a9-144d-4900-807e-914ae0cc8539 {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 798.197221] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] Fetch image to [datastore1] vmware_temp/4d0e1f7e-9de8-467f-8d1f-887006e40103/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 798.197221] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] Downloading image file data e70539a9-144d-4900-807e-914ae0cc8539 to [datastore1] vmware_temp/4d0e1f7e-9de8-467f-8d1f-887006e40103/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk on the data store datastore1 {{(pid=59518) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 798.197683] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-56998d0f-45e5-43a4-b8cc-6594cb587851 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 798.204488] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f5f206a-142e-4446-9ad4-521da8f36039 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 798.213308] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f5fe3e60-6d32-4a64-99c2-67e663f2ecba {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 798.242777] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c6c581df-7083-49e7-8a11-1610081119cf {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} 
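The 798.184-798.339 records trace the vmwareapi driver's image-cache miss path: create a vmware_temp/<request-uuid>/<image-id> directory, stream the Glance image to tmp-sparse.vmdk over an HTTP write handle, then copy the disk into devstack-image-cache_base (the CopyVirtualDisk_Task step that fails with InvalidArgument earlier in this log). A schematic of that control flow, under stated assumptions: local paths stand in for the datastore, and download_from_glance is a hypothetical placeholder for the oslo.vmware read/write handles:

    import shutil
    import uuid
    from pathlib import Path

    # Hypothetical stand-in for datastore1; the real driver addresses it
    # through vCenter, not the local filesystem.
    DATASTORE = Path('/tmp/datastore1')

    def download_from_glance(image_id: str, dest: Path) -> None:
        # Placeholder for streaming image bytes via an HTTP write handle.
        dest.parent.mkdir(parents=True, exist_ok=True)
        dest.write_bytes(b'\x00' * 1024)

    def fetch_image_if_missing(image_id: str) -> Path:
        cached = (DATASTORE / 'devstack-image-cache_base' / image_id /
                  f'{image_id}.vmdk')
        if cached.exists():
            return cached  # cache hit: nothing to fetch
        # Cache miss: fetch into a per-request vmware_temp directory first,
        # mirroring "Fetch image to [datastore1] vmware_temp/<uuid>/<image-id>/tmp-sparse.vmdk".
        tmp = (DATASTORE / 'vmware_temp' / str(uuid.uuid4()) / image_id /
               'tmp-sparse.vmdk')
        download_from_glance(image_id, tmp)
        # The real driver issues CopyVirtualDisk_Task here -- the step that
        # raises InvalidArgument ('fileType') in this log.
        cached.parent.mkdir(parents=True, exist_ok=True)
        shutil.copyfile(tmp, cached)
        return cached

    if __name__ == '__main__':
        print(fetch_image_if_missing('e70539a9-144d-4900-807e-914ae0cc8539'))

The per-image lock records around this flow exist because several builds (here, FloatingIPsAssociation and MigrationsAdmin tests) race to populate the same cache entry.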
[ 798.254837] env[59518]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-5602cb4c-aa34-4d15-8b8e-8e5bcd2d91af {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 798.256757] env[59518]: DEBUG oslo_vmware.api [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Task: {'id': task-307970, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.071083} completed successfully. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 798.256990] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Deleted the datastore file {{(pid=59518) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 798.257162] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] Deleted contents of the VM from datastore datastore1 {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 798.257321] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] Instance destroyed {{(pid=59518) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 798.257482] env[59518]: INFO nova.compute.manager [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] Took 0.60 seconds to destroy the instance on the hypervisor. 
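The "Waiting for the task ... progress is 0% ... completed successfully" records come from oslo.vmware's wait_for_task, which polls task state on a timer (api.py:397/434/444 in the trailers) and translates a failed task into the VimFaultException seen elsewhere in this log. A minimal sketch of that poll-until-done loop, with a fake task object standing in for a live vCenter session:

    import time

    class FakeTask:
        """Stand-in for a vCenter task; a real session reads task.info
        through the property collector."""
        def __init__(self, ticks: int):
            self._ticks = ticks

        def poll(self) -> dict:
            self._ticks -= 1
            if self._ticks > 0:
                return {'state': 'running', 'progress': 0}
            return {'state': 'success', 'progress': 100}

    def wait_for_task(task: FakeTask, interval: float = 0.5) -> dict:
        # Mirrors the log's cadence: report progress on each poll, raise on
        # error (translated upstream into VimFaultException), return once
        # the task reports success.
        while True:
            info = task.poll()
            print(f"Task progress is {info['progress']}%.")
            if info['state'] == 'success':
                return info
            if info['state'] == 'error':
                raise RuntimeError('task failed')
            time.sleep(interval)

    if __name__ == '__main__':
        wait_for_task(FakeTask(ticks=3), interval=0.01)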
[ 798.259602] env[59518]: DEBUG nova.compute.claims [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] Aborting claim: {{(pid=59518) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 798.259789] env[59518]: DEBUG oslo_concurrency.lockutils [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 798.260009] env[59518]: DEBUG oslo_concurrency.lockutils [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 798.287915] env[59518]: DEBUG nova.virt.vmwareapi.images [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] Downloading image file data e70539a9-144d-4900-807e-914ae0cc8539 to the data store datastore1 {{(pid=59518) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 798.339033] env[59518]: DEBUG oslo_vmware.rw_handles [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/4d0e1f7e-9de8-467f-8d1f-887006e40103/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59518) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 798.394155] env[59518]: DEBUG oslo_vmware.rw_handles [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Completed reading data from the image iterator. {{(pid=59518) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 798.394341] env[59518]: DEBUG oslo_vmware.rw_handles [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/4d0e1f7e-9de8-467f-8d1f-887006e40103/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=59518) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 798.449145] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 798.449677] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 798.583838] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5e37d392-f4a0-42dd-8c94-6caedd0259d0 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 798.592056] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4fa69c7e-2660-48a7-9c87-1559016f8910 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 798.621527] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a574a838-f77c-4fea-ba4b-7b41871c1238 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 798.628804] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16a30df8-fbc0-4f9d-b03a-5a47d4ecf9a3 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 798.641867] env[59518]: DEBUG nova.compute.provider_tree [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Inventory has not changed in ProviderTree for provider: ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd {{(pid=59518) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 798.650361] env[59518]: DEBUG nova.scheduler.client.report [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Inventory has not changed for provider ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 174, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59518) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 798.667311] env[59518]: DEBUG oslo_concurrency.lockutils [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.407s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 798.667867] env[59518]: ERROR nova.compute.manager [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] [instance: 
6ffd468f-b92f-45ae-834f-6daac20937ef] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 798.667867] env[59518]: Faults: ['InvalidArgument'] [ 798.667867] env[59518]: ERROR nova.compute.manager [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] Traceback (most recent call last): [ 798.667867] env[59518]: ERROR nova.compute.manager [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 798.667867] env[59518]: ERROR nova.compute.manager [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] self.driver.spawn(context, instance, image_meta, [ 798.667867] env[59518]: ERROR nova.compute.manager [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 798.667867] env[59518]: ERROR nova.compute.manager [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] self._vmops.spawn(context, instance, image_meta, injected_files, [ 798.667867] env[59518]: ERROR nova.compute.manager [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 798.667867] env[59518]: ERROR nova.compute.manager [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] self._fetch_image_if_missing(context, vi) [ 798.667867] env[59518]: ERROR nova.compute.manager [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 798.667867] env[59518]: ERROR nova.compute.manager [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] image_cache(vi, tmp_image_ds_loc) [ 798.667867] env[59518]: ERROR nova.compute.manager [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 798.668276] env[59518]: ERROR nova.compute.manager [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] vm_util.copy_virtual_disk( [ 798.668276] env[59518]: ERROR nova.compute.manager [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 798.668276] env[59518]: ERROR nova.compute.manager [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] session._wait_for_task(vmdk_copy_task) [ 798.668276] env[59518]: ERROR nova.compute.manager [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 798.668276] env[59518]: ERROR nova.compute.manager [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] return self.wait_for_task(task_ref) [ 798.668276] env[59518]: ERROR nova.compute.manager [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 798.668276] env[59518]: ERROR nova.compute.manager [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] return evt.wait() [ 798.668276] env[59518]: ERROR nova.compute.manager [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 798.668276] env[59518]: ERROR nova.compute.manager [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] result = hub.switch() [ 798.668276] env[59518]: ERROR nova.compute.manager [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 798.668276] env[59518]: ERROR nova.compute.manager [instance: 
6ffd468f-b92f-45ae-834f-6daac20937ef] return self.greenlet.switch() [ 798.668276] env[59518]: ERROR nova.compute.manager [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 798.668276] env[59518]: ERROR nova.compute.manager [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] self.f(*self.args, **self.kw) [ 798.668620] env[59518]: ERROR nova.compute.manager [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 798.668620] env[59518]: ERROR nova.compute.manager [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] raise exceptions.translate_fault(task_info.error) [ 798.668620] env[59518]: ERROR nova.compute.manager [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 798.668620] env[59518]: ERROR nova.compute.manager [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] Faults: ['InvalidArgument'] [ 798.668620] env[59518]: ERROR nova.compute.manager [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] [ 798.668620] env[59518]: DEBUG nova.compute.utils [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] VimFaultException {{(pid=59518) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 798.669957] env[59518]: DEBUG nova.compute.manager [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] Build of instance 6ffd468f-b92f-45ae-834f-6daac20937ef was re-scheduled: A specified parameter was not correct: fileType [ 798.669957] env[59518]: Faults: ['InvalidArgument'] {{(pid=59518) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 798.670312] env[59518]: DEBUG nova.compute.manager [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] Unplugging VIFs for instance {{(pid=59518) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 798.670473] env[59518]: DEBUG nova.compute.manager [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59518) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 798.670633] env[59518]: DEBUG nova.compute.manager [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] Deallocating network for instance {{(pid=59518) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 798.670814] env[59518]: DEBUG nova.network.neutron [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] deallocate_for_instance() {{(pid=59518) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 798.929232] env[59518]: DEBUG nova.network.neutron [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] Updating instance_info_cache with network_info: [] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 798.939503] env[59518]: INFO nova.compute.manager [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] Took 0.27 seconds to deallocate network for instance. [ 799.023750] env[59518]: INFO nova.scheduler.client.report [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Deleted allocations for instance 6ffd468f-b92f-45ae-834f-6daac20937ef [ 799.041742] env[59518]: DEBUG oslo_concurrency.lockutils [None req-b59632f5-915b-4283-b1b8-34a2dd9218ea tempest-FloatingIPsAssociationTestJSON-1764718150 tempest-FloatingIPsAssociationTestJSON-1764718150-project-member] Lock "6ffd468f-b92f-45ae-834f-6daac20937ef" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 194.654s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 799.042835] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "6ffd468f-b92f-45ae-834f-6daac20937ef" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 188.444s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 799.043011] env[59518]: INFO nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: 6ffd468f-b92f-45ae-834f-6daac20937ef] During sync_power_state the instance has a pending task (spawning). Skip. 
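Editor's note on the failure above: the build of instance 6ffd468f-b92f-45ae-834f-6daac20937ef dies inside _cache_sparse_image when the CopyVirtualDisk_Task comes back with InvalidArgument ("A specified parameter was not correct: fileType"), and the exception surfaces from oslo.vmware's task-polling loop (_poll_task raising translate_fault(task_info.error) inside a loopingcall), after which nova re-schedules the build and deallocates networking. The following is a minimal, self-contained sketch of that polling shape only; TaskInfo, read_task_info and this local VimFaultException are hypothetical stand-ins for illustration, not the oslo_vmware implementation.

    # Illustrative sketch (not oslo.vmware itself): how a task-polling loop
    # turns a vCenter task error like the one above into a raised exception.
    import time


    class VimFaultException(Exception):
        """Stand-in for oslo_vmware.exceptions.VimFaultException."""

        def __init__(self, fault_list, message):
            super().__init__(message)
            self.fault_list = fault_list


    class TaskInfo:
        """Minimal stand-in for a vSphere TaskInfo object."""

        def __init__(self, state, error=None):
            self.state = state    # 'running' | 'success' | 'error'
            self.error = error    # e.g. {'faults': [...], 'msg': '...'}


    def wait_for_task(read_task_info, interval=0.5):
        """Poll task info until it finishes; raise on an 'error' state.

        Mirrors the shape of the loop in the traceback: each tick re-reads
        the task info and translates a fault into an exception on error.
        """
        while True:
            info = read_task_info()
            if info.state == 'success':
                return info
            if info.state == 'error':
                raise VimFaultException(info.error['faults'],
                                        info.error['msg'])
            time.sleep(interval)


    if __name__ == '__main__':
        # Two polls: one in-flight, then the same fault seen in this log.
        states = iter([
            TaskInfo('running'),
            TaskInfo('error', error={
                'faults': ['InvalidArgument'],
                'msg': 'A specified parameter was not correct: fileType',
            }),
        ])
        try:
            wait_for_task(lambda: next(states), interval=0.01)
        except VimFaultException as exc:
            print('Faults:', exc.fault_list, '-', exc)

Because the fault is raised out of driver.spawn(), _do_build_and_run_instance treats it as retriable, which is why the log shows the claim being aborted, the instance re-scheduled, and the 194.654s per-instance build lock finally released.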
[ 799.043174] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "6ffd468f-b92f-45ae-834f-6daac20937ef" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 799.058604] env[59518]: DEBUG nova.compute.manager [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Starting instance... {{(pid=59518) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 799.111241] env[59518]: DEBUG oslo_concurrency.lockutils [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 799.111486] env[59518]: DEBUG oslo_concurrency.lockutils [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 799.112886] env[59518]: INFO nova.compute.claims [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 799.362778] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-61de9f8d-bd27-4b5c-903f-4035c8067af2 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 799.370167] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ba7bdd3b-fdbb-4214-9bd5-b6f60f5c6295 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 799.398947] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7230381b-e434-410c-b6d6-9aaa55001b57 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 799.405933] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c52c867-de98-4f76-9cbc-97e29cc4cd1e {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 799.418761] env[59518]: DEBUG nova.compute.provider_tree [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Inventory has not changed in ProviderTree for provider: ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd {{(pid=59518) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 799.427012] env[59518]: DEBUG nova.scheduler.client.report [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 
tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Inventory has not changed for provider ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 174, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59518) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 799.439087] env[59518]: DEBUG oslo_concurrency.lockutils [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.328s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 799.439528] env[59518]: DEBUG nova.compute.manager [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Start building networks asynchronously for instance. {{(pid=59518) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 799.447400] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 799.447533] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=59518) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10427}} [ 799.469224] env[59518]: DEBUG nova.compute.utils [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Using /dev/sd instead of None {{(pid=59518) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 799.470491] env[59518]: DEBUG nova.compute.manager [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Allocating IP information in the background. {{(pid=59518) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 799.470645] env[59518]: DEBUG nova.network.neutron [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] allocate_for_instance() {{(pid=59518) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 799.479464] env[59518]: DEBUG nova.compute.manager [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Start building block device mappings for instance. 
{{(pid=59518) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 799.514055] env[59518]: DEBUG nova.policy [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '856494a8bda043d391596a2b516c166b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd4e10f0558b44bcd91b960e50278f961', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59518) authorize /opt/stack/nova/nova/policy.py:203}} [ 799.541788] env[59518]: DEBUG nova.compute.manager [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Start spawning the instance on the hypervisor. {{(pid=59518) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 799.560784] env[59518]: DEBUG nova.virt.hardware [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-05-28T13:15:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-05-28T13:15:21Z,direct_url=,disk_format='vmdk',id=e70539a9-144d-4900-807e-914ae0cc8539,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='33bab75ff2cf45ecb4ab54af3adf83ee',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-05-28T13:15:22Z,virtual_size=,visibility=), allow threads: False {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 799.561224] env[59518]: DEBUG nova.virt.hardware [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Flavor limits 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 799.561224] env[59518]: DEBUG nova.virt.hardware [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Image limits 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 799.561360] env[59518]: DEBUG nova.virt.hardware [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Flavor pref 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 799.561500] env[59518]: DEBUG nova.virt.hardware [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Image pref 0:0:0 {{(pid=59518) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 799.561639] env[59518]: DEBUG nova.virt.hardware [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 799.561835] env[59518]: DEBUG nova.virt.hardware [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 799.562009] env[59518]: DEBUG nova.virt.hardware [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59518) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 799.562179] env[59518]: DEBUG nova.virt.hardware [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Got 1 possible topologies {{(pid=59518) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 799.562334] env[59518]: DEBUG nova.virt.hardware [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 799.562495] env[59518]: DEBUG nova.virt.hardware [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 799.563317] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bbc67def-a088-42b2-adaa-3a475be24452 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 799.570962] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5d46f3bc-06a9-4c9d-b0be-6a759f8428ba {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 799.782418] env[59518]: DEBUG nova.network.neutron [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Successfully created port: 29b60c4c-a991-41f5-9b79-066487dc7b6b {{(pid=59518) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 800.447581] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 800.457992] env[59518]: 
DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 800.458210] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 800.458364] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 800.458512] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59518) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 800.459589] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9980ae09-1746-499c-9b88-e8c83fea5f1c {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 800.471464] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f0f82b71-43bc-4936-8116-b34ee00d4394 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 800.483583] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-29dc2d35-dc9f-40a9-9711-b4037e81546e {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 800.489916] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-40533c06-dea2-4a90-abd5-220decf4dacf {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 800.518549] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181782MB free_disk=174GB free_vcpus=48 pci_devices=None {{(pid=59518) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 800.518748] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 800.518969] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 800.556684] env[59518]: DEBUG nova.network.neutron [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 
tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Successfully updated port: 29b60c4c-a991-41f5-9b79-066487dc7b6b {{(pid=59518) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 800.564983] env[59518]: DEBUG oslo_concurrency.lockutils [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Acquiring lock "refresh_cache-ae88d565-bbf5-4c29-aee9-364c23086de5" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 800.565127] env[59518]: DEBUG oslo_concurrency.lockutils [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Acquired lock "refresh_cache-ae88d565-bbf5-4c29-aee9-364c23086de5" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 800.565270] env[59518]: DEBUG nova.network.neutron [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Building network info cache for instance {{(pid=59518) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 800.611378] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 37d863b9-bfcb-4d1f-b99b-832276bd640f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 800.611583] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance a894a8af-52b8-4b1c-a5ea-2469f06ea17a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 800.611694] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance b4fb287e-7329-4002-911a-2d1eee138372 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 800.611729] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance a85abea1-8e8d-4007-803d-e36fff55e587 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 800.611827] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance eead3c41-1a63-48f7-941e-24470658ed13 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 800.611950] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 04a58b0b-dfd8-4227-9c10-a69225fa5a53 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 800.612114] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 282b61db-76cd-44c3-b500-7a465e903c97 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 800.612240] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 3e59b5d7-978d-405a-b68a-47ee03b9a713 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 800.612352] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance af4b8dd9-a05d-427e-a147-76c7cfec5862 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 800.612473] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance ae88d565-bbf5-4c29-aee9-364c23086de5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 800.614248] env[59518]: DEBUG nova.network.neutron [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Instance cache missing network info. {{(pid=59518) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 800.625095] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance ac3485ac-4817-4492-a196-331002b2cc46 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 800.635369] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance ad83492d-05a6-428d-b343-740c977105f8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 800.645132] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance c4eb8b6b-c958-4d21-9e6c-c1ba6b62c619 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 800.654499] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 2aaaecaa-86c6-4d8c-89a8-7d8e2405d294 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 800.672898] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 3bdabf32-3735-4670-8591-fad410629d95 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 800.682980] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 68abcc97-7992-474e-873c-ce247f4f1bec has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 800.695178] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 00a7659b-41f1-4224-a111-01670979c415 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 800.705610] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 7d4fa130-c399-4e8c-a711-33a08ed5dde9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 800.715451] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 2277f51e-d169-416c-b86b-fb8a019a309d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 800.715668] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=59518) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 800.715810] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=59518) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 800.841861] env[59518]: DEBUG nova.network.neutron [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Updating instance_info_cache with network_info: [{"id": "29b60c4c-a991-41f5-9b79-066487dc7b6b", "address": "fa:16:3e:d7:a2:d3", "network": {"id": "260717b5-c2b3-4704-9bf5-2caff28d9a2b", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1963611875-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "d4e10f0558b44bcd91b960e50278f961", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a06a63d6-2aeb-4084-8022-f804cac3fa74", "external-id": "nsx-vlan-transportzone-797", "segmentation_id": 797, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap29b60c4c-a9", "ovs_interfaceid": "29b60c4c-a991-41f5-9b79-066487dc7b6b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 800.854247] env[59518]: DEBUG oslo_concurrency.lockutils [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Releasing lock "refresh_cache-ae88d565-bbf5-4c29-aee9-364c23086de5" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 800.854536] env[59518]: DEBUG nova.compute.manager [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Instance network_info: |[{"id": "29b60c4c-a991-41f5-9b79-066487dc7b6b", "address": "fa:16:3e:d7:a2:d3", "network": {"id": "260717b5-c2b3-4704-9bf5-2caff28d9a2b", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1963611875-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "d4e10f0558b44bcd91b960e50278f961", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a06a63d6-2aeb-4084-8022-f804cac3fa74", "external-id": "nsx-vlan-transportzone-797", "segmentation_id": 797, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap29b60c4c-a9", "ovs_interfaceid": "29b60c4c-a991-41f5-9b79-066487dc7b6b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59518) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 800.854883] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:d7:a2:d3', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'a06a63d6-2aeb-4084-8022-f804cac3fa74', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '29b60c4c-a991-41f5-9b79-066487dc7b6b', 'vif_model': 'vmxnet3'}] {{(pid=59518) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 800.863204] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Creating folder: Project (d4e10f0558b44bcd91b960e50278f961). Parent ref: group-v88807. {{(pid=59518) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 800.865911] env[59518]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9d40217d-d707-4c9f-bbc9-07d5f2f7c8cf {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 800.880863] env[59518]: INFO nova.virt.vmwareapi.vm_util [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Created folder: Project (d4e10f0558b44bcd91b960e50278f961) in parent group-v88807. [ 800.881063] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Creating folder: Instances. Parent ref: group-v88851. {{(pid=59518) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 800.881497] env[59518]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f2cf92a0-39c0-4542-9625-b747dbf6ca25 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 800.890405] env[59518]: INFO nova.virt.vmwareapi.vm_util [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Created folder: Instances in parent group-v88851. [ 800.890637] env[59518]: DEBUG oslo.service.loopingcall [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=59518) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 800.893067] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Creating VM on the ESX host {{(pid=59518) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 800.893606] env[59518]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-686dc5ce-2e93-4868-9695-ed77ed935aa4 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 800.914601] env[59518]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 800.914601] env[59518]: value = "task-307973" [ 800.914601] env[59518]: _type = "Task" [ 800.914601] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 800.922355] env[59518]: DEBUG oslo_vmware.api [-] Task: {'id': task-307973, 'name': CreateVM_Task} progress is 0%. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 800.981841] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c3b97cb-ad99-4761-a101-28b25425941f {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 800.989285] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c4f2fd64-ff91-4acf-9fe9-468411830c79 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 800.994600] env[59518]: DEBUG nova.compute.manager [req-db64c8b3-10af-416f-a84e-754866050560 req-e0c914f8-78bf-4efe-b092-f57573f77713 service nova] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Received event network-vif-plugged-29b60c4c-a991-41f5-9b79-066487dc7b6b {{(pid=59518) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 800.994799] env[59518]: DEBUG oslo_concurrency.lockutils [req-db64c8b3-10af-416f-a84e-754866050560 req-e0c914f8-78bf-4efe-b092-f57573f77713 service nova] Acquiring lock "ae88d565-bbf5-4c29-aee9-364c23086de5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 800.994996] env[59518]: DEBUG oslo_concurrency.lockutils [req-db64c8b3-10af-416f-a84e-754866050560 req-e0c914f8-78bf-4efe-b092-f57573f77713 service nova] Lock "ae88d565-bbf5-4c29-aee9-364c23086de5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 800.995191] env[59518]: DEBUG oslo_concurrency.lockutils [req-db64c8b3-10af-416f-a84e-754866050560 req-e0c914f8-78bf-4efe-b092-f57573f77713 service nova] Lock "ae88d565-bbf5-4c29-aee9-364c23086de5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 800.995365] env[59518]: DEBUG nova.compute.manager [req-db64c8b3-10af-416f-a84e-754866050560 req-e0c914f8-78bf-4efe-b092-f57573f77713 service nova] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] No waiting events found dispatching network-vif-plugged-29b60c4c-a991-41f5-9b79-066487dc7b6b {{(pid=59518) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 800.995521] env[59518]: WARNING 
nova.compute.manager [req-db64c8b3-10af-416f-a84e-754866050560 req-e0c914f8-78bf-4efe-b092-f57573f77713 service nova] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Received unexpected event network-vif-plugged-29b60c4c-a991-41f5-9b79-066487dc7b6b for instance with vm_state building and task_state spawning. [ 800.995675] env[59518]: DEBUG nova.compute.manager [req-db64c8b3-10af-416f-a84e-754866050560 req-e0c914f8-78bf-4efe-b092-f57573f77713 service nova] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Received event network-changed-29b60c4c-a991-41f5-9b79-066487dc7b6b {{(pid=59518) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 800.995820] env[59518]: DEBUG nova.compute.manager [req-db64c8b3-10af-416f-a84e-754866050560 req-e0c914f8-78bf-4efe-b092-f57573f77713 service nova] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Refreshing instance network info cache due to event network-changed-29b60c4c-a991-41f5-9b79-066487dc7b6b. {{(pid=59518) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 800.995993] env[59518]: DEBUG oslo_concurrency.lockutils [req-db64c8b3-10af-416f-a84e-754866050560 req-e0c914f8-78bf-4efe-b092-f57573f77713 service nova] Acquiring lock "refresh_cache-ae88d565-bbf5-4c29-aee9-364c23086de5" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 800.996171] env[59518]: DEBUG oslo_concurrency.lockutils [req-db64c8b3-10af-416f-a84e-754866050560 req-e0c914f8-78bf-4efe-b092-f57573f77713 service nova] Acquired lock "refresh_cache-ae88d565-bbf5-4c29-aee9-364c23086de5" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 800.996319] env[59518]: DEBUG nova.network.neutron [req-db64c8b3-10af-416f-a84e-754866050560 req-e0c914f8-78bf-4efe-b092-f57573f77713 service nova] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Refreshing network info cache for port 29b60c4c-a991-41f5-9b79-066487dc7b6b {{(pid=59518) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 801.028207] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6cb3f9bf-c426-47dc-b0bd-2c12817b27cd {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 801.036520] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ae5c8478-9106-4fde-a661-f590af50f9e1 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 801.050112] env[59518]: DEBUG nova.compute.provider_tree [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Inventory has not changed in ProviderTree for provider: ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd {{(pid=59518) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 801.086529] env[59518]: DEBUG nova.scheduler.client.report [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Inventory has not changed for provider ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 174, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59518) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 801.101919] 
env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59518) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 801.102033] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.583s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 801.280189] env[59518]: DEBUG nova.network.neutron [req-db64c8b3-10af-416f-a84e-754866050560 req-e0c914f8-78bf-4efe-b092-f57573f77713 service nova] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Updated VIF entry in instance network info cache for port 29b60c4c-a991-41f5-9b79-066487dc7b6b. {{(pid=59518) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 801.280535] env[59518]: DEBUG nova.network.neutron [req-db64c8b3-10af-416f-a84e-754866050560 req-e0c914f8-78bf-4efe-b092-f57573f77713 service nova] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Updating instance_info_cache with network_info: [{"id": "29b60c4c-a991-41f5-9b79-066487dc7b6b", "address": "fa:16:3e:d7:a2:d3", "network": {"id": "260717b5-c2b3-4704-9bf5-2caff28d9a2b", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1963611875-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "d4e10f0558b44bcd91b960e50278f961", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a06a63d6-2aeb-4084-8022-f804cac3fa74", "external-id": "nsx-vlan-transportzone-797", "segmentation_id": 797, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap29b60c4c-a9", "ovs_interfaceid": "29b60c4c-a991-41f5-9b79-066487dc7b6b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 801.291095] env[59518]: DEBUG oslo_concurrency.lockutils [req-db64c8b3-10af-416f-a84e-754866050560 req-e0c914f8-78bf-4efe-b092-f57573f77713 service nova] Releasing lock "refresh_cache-ae88d565-bbf5-4c29-aee9-364c23086de5" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 801.425115] env[59518]: DEBUG oslo_vmware.api [-] Task: {'id': task-307973, 'name': CreateVM_Task, 'duration_secs': 0.278874} completed successfully. 
{{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 801.425277] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Created VM on the ESX host {{(pid=59518) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 801.425917] env[59518]: DEBUG oslo_concurrency.lockutils [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 801.426062] env[59518]: DEBUG oslo_concurrency.lockutils [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 801.426356] env[59518]: DEBUG oslo_concurrency.lockutils [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 801.426591] env[59518]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-400e952a-d22d-404c-827e-9c619661db80 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 801.431263] env[59518]: DEBUG oslo_vmware.api [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Waiting for the task: (returnval){ [ 801.431263] env[59518]: value = "session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]52676612-70b7-9706-d304-949fb0b6fa55" [ 801.431263] env[59518]: _type = "Task" [ 801.431263] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 801.438976] env[59518]: DEBUG oslo_vmware.api [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Task: {'id': session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]52676612-70b7-9706-d304-949fb0b6fa55, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 801.941304] env[59518]: DEBUG oslo_concurrency.lockutils [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 801.941591] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Processing image e70539a9-144d-4900-807e-914ae0cc8539 {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 801.941738] env[59518]: DEBUG oslo_concurrency.lockutils [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 803.697039] env[59518]: DEBUG nova.compute.manager [req-184e7a69-8c13-4c9d-a312-aeb05f86e0cd req-9073beb7-0bce-4f5a-b47d-992299a9f6ac service nova] [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] Received event network-vif-deleted-e2be4581-77a6-4a18-8394-62cc4710988c {{(pid=59518) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 803.939279] env[59518]: DEBUG oslo_concurrency.lockutils [None req-79683d9b-6071-48f3-a544-cf58f8007bbb tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Acquiring lock "a894a8af-52b8-4b1c-a5ea-2469f06ea17a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 804.653524] env[59518]: DEBUG oslo_concurrency.lockutils [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Acquiring lock "935d4358-07b0-423b-8685-26d5bafe9e2f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 804.653524] env[59518]: DEBUG oslo_concurrency.lockutils [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Lock "935d4358-07b0-423b-8685-26d5bafe9e2f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 814.221292] env[59518]: DEBUG oslo_concurrency.lockutils [None req-0e50e78e-3606-4804-bbb0-7ab66264cb88 tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Acquiring lock "b4fb287e-7329-4002-911a-2d1eee138372" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 815.274141] env[59518]: DEBUG
oslo_concurrency.lockutils [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] Acquiring lock "3981aa30-0515-4764-9aac-d0c99a48b064" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 815.274432] env[59518]: DEBUG oslo_concurrency.lockutils [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] Lock "3981aa30-0515-4764-9aac-d0c99a48b064" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 818.287551] env[59518]: DEBUG oslo_concurrency.lockutils [None req-97aac32e-ac6f-40ba-848a-2ac12c57224d tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Acquiring lock "af4b8dd9-a05d-427e-a147-76c7cfec5862" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 818.344484] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c70b28d4-2bd1-44f2-8366-412347f0cf9d tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Acquiring lock "3e59b5d7-978d-405a-b68a-47ee03b9a713" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 819.701015] env[59518]: DEBUG oslo_concurrency.lockutils [None req-5e7f2c47-9909-46bd-bf47-550c5524d4fe tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Acquiring lock "282b61db-76cd-44c3-b500-7a465e903c97" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 820.026014] env[59518]: DEBUG oslo_concurrency.lockutils [None req-ace62319-0f11-46ef-a2c8-43222a4d7576 tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Acquiring lock "04a58b0b-dfd8-4227-9c10-a69225fa5a53" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 820.086022] env[59518]: DEBUG oslo_concurrency.lockutils [None req-5b6a5a77-a1fc-4f8d-a1c6-74c4a91c9b5e tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Acquiring lock "a85abea1-8e8d-4007-803d-e36fff55e587" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 820.174184] env[59518]: DEBUG oslo_concurrency.lockutils [None req-a0a4b1a6-503e-40bd-8dcf-7d9e29694b2e tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Acquiring lock "eead3c41-1a63-48f7-941e-24470658ed13" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 820.194157] env[59518]: DEBUG oslo_concurrency.lockutils [None req-49c7338e-23cb-44d3-9826-421416376e5d tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Acquiring lock "ae88d565-bbf5-4c29-aee9-364c23086de5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 847.564848] env[59518]: WARNING oslo_vmware.rw_handles [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 847.564848] env[59518]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 847.564848] env[59518]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 847.564848] env[59518]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 847.564848] env[59518]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 847.564848] env[59518]: ERROR oslo_vmware.rw_handles response.begin() [ 847.564848] env[59518]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 847.564848] env[59518]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 847.564848] env[59518]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 847.564848] env[59518]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 847.564848] env[59518]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 847.564848] env[59518]: ERROR oslo_vmware.rw_handles [ 847.565475] env[59518]: DEBUG nova.virt.vmwareapi.images [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] Downloaded image file data e70539a9-144d-4900-807e-914ae0cc8539 to vmware_temp/4d0e1f7e-9de8-467f-8d1f-887006e40103/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk on the data store datastore1 {{(pid=59518) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 847.566968] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] Caching image {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 847.567242] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Copying Virtual Disk [datastore1] vmware_temp/4d0e1f7e-9de8-467f-8d1f-887006e40103/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk to [datastore1] vmware_temp/4d0e1f7e-9de8-467f-8d1f-887006e40103/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk {{(pid=59518) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 847.567554] env[59518]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-522405b6-1e65-4e54-b912-57895dc47cfa
{{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 847.575207] env[59518]: DEBUG oslo_vmware.api [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Waiting for the task: (returnval){ [ 847.575207] env[59518]: value = "task-307974" [ 847.575207] env[59518]: _type = "Task" [ 847.575207] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 847.583214] env[59518]: DEBUG oslo_vmware.api [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Task: {'id': task-307974, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 848.085552] env[59518]: DEBUG oslo_vmware.exceptions [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Fault InvalidArgument not matched. {{(pid=59518) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 848.085789] env[59518]: DEBUG oslo_concurrency.lockutils [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 848.086339] env[59518]: ERROR nova.compute.manager [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 848.086339] env[59518]: Faults: ['InvalidArgument'] [ 848.086339] env[59518]: ERROR nova.compute.manager [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] Traceback (most recent call last): [ 848.086339] env[59518]: ERROR nova.compute.manager [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 848.086339] env[59518]: ERROR nova.compute.manager [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] yield resources [ 848.086339] env[59518]: ERROR nova.compute.manager [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 848.086339] env[59518]: ERROR nova.compute.manager [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] self.driver.spawn(context, instance, image_meta, [ 848.086339] env[59518]: ERROR nova.compute.manager [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 848.086339] env[59518]: ERROR nova.compute.manager [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 848.086339] env[59518]: ERROR nova.compute.manager [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 848.086339] env[59518]: ERROR nova.compute.manager [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] self._fetch_image_if_missing(context, vi) [ 848.086339] env[59518]: ERROR nova.compute.manager 
[instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 848.086795] env[59518]: ERROR nova.compute.manager [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] image_cache(vi, tmp_image_ds_loc) [ 848.086795] env[59518]: ERROR nova.compute.manager [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 848.086795] env[59518]: ERROR nova.compute.manager [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] vm_util.copy_virtual_disk( [ 848.086795] env[59518]: ERROR nova.compute.manager [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 848.086795] env[59518]: ERROR nova.compute.manager [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] session._wait_for_task(vmdk_copy_task) [ 848.086795] env[59518]: ERROR nova.compute.manager [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 848.086795] env[59518]: ERROR nova.compute.manager [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] return self.wait_for_task(task_ref) [ 848.086795] env[59518]: ERROR nova.compute.manager [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 848.086795] env[59518]: ERROR nova.compute.manager [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] return evt.wait() [ 848.086795] env[59518]: ERROR nova.compute.manager [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 848.086795] env[59518]: ERROR nova.compute.manager [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] result = hub.switch() [ 848.086795] env[59518]: ERROR nova.compute.manager [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 848.086795] env[59518]: ERROR nova.compute.manager [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] return self.greenlet.switch() [ 848.087241] env[59518]: ERROR nova.compute.manager [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 848.087241] env[59518]: ERROR nova.compute.manager [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] self.f(*self.args, **self.kw) [ 848.087241] env[59518]: ERROR nova.compute.manager [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 848.087241] env[59518]: ERROR nova.compute.manager [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] raise exceptions.translate_fault(task_info.error) [ 848.087241] env[59518]: ERROR nova.compute.manager [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 848.087241] env[59518]: ERROR nova.compute.manager [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] Faults: ['InvalidArgument'] [ 848.087241] env[59518]: ERROR nova.compute.manager [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] [ 848.087241] env[59518]: INFO nova.compute.manager [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] Terminating 
instance [ 848.088177] env[59518]: DEBUG oslo_concurrency.lockutils [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 848.088382] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 848.088616] env[59518]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-533ec087-d942-4b56-8139-78dd08db74c4 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 848.091051] env[59518]: DEBUG nova.compute.manager [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] Start destroying the instance on the hypervisor. {{(pid=59518) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 848.091276] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] Destroying instance {{(pid=59518) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 848.092072] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2510afbd-9625-4440-ae08-68d2218d5770 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 848.099029] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] Unregistering the VM {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 848.099230] env[59518]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-e3ec98b4-de67-4317-a6fb-cc3d62d92b19 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 848.101517] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 848.101701] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=59518) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 848.102623] env[59518]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-40e0f933-2555-435c-a31b-4472f7ada411 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 848.107082] env[59518]: DEBUG oslo_vmware.api [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Waiting for the task: (returnval){ [ 848.107082] env[59518]: value = "session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]528ac571-93e9-343b-fa86-0b055ec94129" [ 848.107082] env[59518]: _type = "Task" [ 848.107082] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 848.115067] env[59518]: DEBUG oslo_vmware.api [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Task: {'id': session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]528ac571-93e9-343b-fa86-0b055ec94129, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 848.168112] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] Unregistered the VM {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 848.168484] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] Deleting contents of the VM from datastore datastore1 {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 848.168758] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Deleting the datastore file [datastore1] 37d863b9-bfcb-4d1f-b99b-832276bd640f {{(pid=59518) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 848.169097] env[59518]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-da0b20fd-33d8-4266-9903-08817edbf757 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 848.175354] env[59518]: DEBUG oslo_vmware.api [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Waiting for the task: (returnval){ [ 848.175354] env[59518]: value = "task-307976" [ 848.175354] env[59518]: _type = "Task" [ 848.175354] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 848.182601] env[59518]: DEBUG oslo_vmware.api [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Task: {'id': task-307976, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 848.618383] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] Preparing fetch location {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 848.618685] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Creating directory with path [datastore1] vmware_temp/b819fd3b-13ed-47b1-8e95-2c2d8bc3c210/e70539a9-144d-4900-807e-914ae0cc8539 {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 848.618928] env[59518]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-cbe67a23-6394-4cfd-8142-2a7e00c0cfa4 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 848.630914] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Created directory with path [datastore1] vmware_temp/b819fd3b-13ed-47b1-8e95-2c2d8bc3c210/e70539a9-144d-4900-807e-914ae0cc8539 {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 848.630914] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] Fetch image to [datastore1] vmware_temp/b819fd3b-13ed-47b1-8e95-2c2d8bc3c210/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 848.631088] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] Downloading image file data e70539a9-144d-4900-807e-914ae0cc8539 to [datastore1] vmware_temp/b819fd3b-13ed-47b1-8e95-2c2d8bc3c210/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk on the data store datastore1 {{(pid=59518) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 848.631753] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b8e22ef-9aca-4f34-a474-be7aa058a664 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 848.638722] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-677e14c2-a313-4aba-8063-41ea06dca153 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 848.647638] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-51f860c8-3e88-4d8a-bcf9-0f4dff815a43 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 848.681497] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3aeade12-45c6-4ccd-aa80-a25c106144e1 {{(pid=59518) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 848.688353] env[59518]: DEBUG oslo_vmware.api [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Task: {'id': task-307976, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.075492} completed successfully. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 848.689709] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Deleted the datastore file {{(pid=59518) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 848.689892] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] Deleted contents of the VM from datastore datastore1 {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 848.690056] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] Instance destroyed {{(pid=59518) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 848.691786] env[59518]: INFO nova.compute.manager [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 848.692207] env[59518]: DEBUG nova.compute.claims [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] Aborting claim: {{(pid=59518) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 848.692365] env[59518]: DEBUG oslo_concurrency.lockutils [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 848.692574] env[59518]: DEBUG oslo_concurrency.lockutils [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 848.696039] env[59518]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-ae599685-c756-4a26-9215-0498db2a0ac2 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 848.717922] env[59518]: DEBUG nova.virt.vmwareapi.images [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] Downloading image file data e70539a9-144d-4900-807e-914ae0cc8539 to the data store datastore1 {{(pid=59518) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 848.721707] env[59518]: DEBUG oslo_concurrency.lockutils [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.029s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 848.722427] env[59518]: DEBUG nova.compute.utils [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] Instance 37d863b9-bfcb-4d1f-b99b-832276bd640f could not be found. {{(pid=59518) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 848.724042] env[59518]: DEBUG nova.compute.manager [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] Instance disappeared during build. 
{{(pid=59518) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 848.724229] env[59518]: DEBUG nova.compute.manager [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] Unplugging VIFs for instance {{(pid=59518) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 848.724859] env[59518]: DEBUG nova.compute.manager [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59518) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 848.724859] env[59518]: DEBUG nova.compute.manager [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] Deallocating network for instance {{(pid=59518) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 848.724859] env[59518]: DEBUG nova.network.neutron [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] deallocate_for_instance() {{(pid=59518) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 848.762831] env[59518]: DEBUG oslo_vmware.rw_handles [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b819fd3b-13ed-47b1-8e95-2c2d8bc3c210/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59518) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 848.819779] env[59518]: DEBUG oslo_vmware.rw_handles [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Completed reading data from the image iterator. {{(pid=59518) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 848.819960] env[59518]: DEBUG oslo_vmware.rw_handles [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b819fd3b-13ed-47b1-8e95-2c2d8bc3c210/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=59518) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 848.822973] env[59518]: DEBUG nova.network.neutron [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] Updating instance_info_cache with network_info: [] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 848.831868] env[59518]: INFO nova.compute.manager [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] Took 0.11 seconds to deallocate network for instance. [ 848.875723] env[59518]: DEBUG oslo_concurrency.lockutils [None req-bbcf335a-9062-484d-b9c6-a50e441b895b tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Lock "37d863b9-bfcb-4d1f-b99b-832276bd640f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 243.793s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 848.877072] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "37d863b9-bfcb-4d1f-b99b-832276bd640f" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 238.278s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 848.877250] env[59518]: INFO nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] During sync_power_state the instance has a pending task (spawning). Skip. [ 848.877412] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "37d863b9-bfcb-4d1f-b99b-832276bd640f" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 848.883693] env[59518]: DEBUG nova.compute.manager [None req-116c1cff-3a7b-4c00-a467-567411db418e tempest-ServerRescueNegativeTestJSON-1916113752 tempest-ServerRescueNegativeTestJSON-1916113752-project-member] [instance: ec34b663-788a-4d55-aca8-3e139b374f71] Starting instance... {{(pid=59518) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 848.903888] env[59518]: DEBUG nova.compute.manager [None req-116c1cff-3a7b-4c00-a467-567411db418e tempest-ServerRescueNegativeTestJSON-1916113752 tempest-ServerRescueNegativeTestJSON-1916113752-project-member] [instance: ec34b663-788a-4d55-aca8-3e139b374f71] Instance disappeared before build.
{{(pid=59518) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 848.922383] env[59518]: DEBUG oslo_concurrency.lockutils [None req-116c1cff-3a7b-4c00-a467-567411db418e tempest-ServerRescueNegativeTestJSON-1916113752 tempest-ServerRescueNegativeTestJSON-1916113752-project-member] Lock "ec34b663-788a-4d55-aca8-3e139b374f71" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 225.677s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 848.931462] env[59518]: DEBUG nova.compute.manager [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] [instance: ac3485ac-4817-4492-a196-331002b2cc46] Starting instance... {{(pid=59518) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 848.985054] env[59518]: DEBUG oslo_concurrency.lockutils [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 848.985287] env[59518]: DEBUG oslo_concurrency.lockutils [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 848.986624] env[59518]: INFO nova.compute.claims [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] [instance: ac3485ac-4817-4492-a196-331002b2cc46] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 849.249021] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d6f51d1c-4144-44d0-af14-138d66889423 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 849.256143] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3351044e-1003-431e-b039-447d1b8378dd {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 849.284566] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e96a1970-2d0b-4393-be48-52acc7e9cc34 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 849.291557] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b2616856-c77c-4e40-8ba0-e07baefda5bd {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 849.303898] env[59518]: DEBUG nova.compute.provider_tree [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] Inventory has not changed in ProviderTree for provider: ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd {{(pid=59518) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 849.312500] env[59518]: DEBUG nova.scheduler.client.report
[None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] Inventory has not changed for provider ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 174, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59518) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 849.324849] env[59518]: DEBUG oslo_concurrency.lockutils [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.339s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 849.325267] env[59518]: DEBUG nova.compute.manager [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] [instance: ac3485ac-4817-4492-a196-331002b2cc46] Start building networks asynchronously for instance. {{(pid=59518) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 849.355775] env[59518]: DEBUG nova.compute.utils [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] Using /dev/sd instead of None {{(pid=59518) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 849.357143] env[59518]: DEBUG nova.compute.manager [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] [instance: ac3485ac-4817-4492-a196-331002b2cc46] Allocating IP information in the background. {{(pid=59518) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 849.357292] env[59518]: DEBUG nova.network.neutron [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] [instance: ac3485ac-4817-4492-a196-331002b2cc46] allocate_for_instance() {{(pid=59518) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 849.364830] env[59518]: DEBUG nova.compute.manager [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] [instance: ac3485ac-4817-4492-a196-331002b2cc46] Start building block device mappings for instance. 
{{(pid=59518) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 849.419560] env[59518]: DEBUG nova.policy [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4130ad3a70294abca2f746d909303daa', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3ccdd7632bb54b3cbb7dd36ec079f938', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59518) authorize /opt/stack/nova/nova/policy.py:203}} [ 849.428884] env[59518]: DEBUG nova.compute.manager [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] [instance: ac3485ac-4817-4492-a196-331002b2cc46] Start spawning the instance on the hypervisor. {{(pid=59518) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 849.449036] env[59518]: DEBUG nova.virt.hardware [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-05-28T13:15:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-05-28T13:15:21Z,direct_url=<?>,disk_format='vmdk',id=e70539a9-144d-4900-807e-914ae0cc8539,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='33bab75ff2cf45ecb4ab54af3adf83ee',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2024-05-28T13:15:22Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 849.449258] env[59518]: DEBUG nova.virt.hardware [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] Flavor limits 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 849.449405] env[59518]: DEBUG nova.virt.hardware [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] Image limits 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 849.449578] env[59518]: DEBUG nova.virt.hardware [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] Flavor pref 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 849.449713] env[59518]: DEBUG nova.virt.hardware [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] Image pref 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 849.449848] env[59518]: DEBUG nova.virt.hardware [None
req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 849.450047] env[59518]: DEBUG nova.virt.hardware [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 849.450199] env[59518]: DEBUG nova.virt.hardware [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59518) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 849.450356] env[59518]: DEBUG nova.virt.hardware [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] Got 1 possible topologies {{(pid=59518) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 849.450511] env[59518]: DEBUG nova.virt.hardware [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 849.450677] env[59518]: DEBUG nova.virt.hardware [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 849.451657] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf4b542b-441d-4b91-bb3c-0829cbc82784 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 849.459236] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9fa2a598-95b7-486e-8a70-ef97c2a98c94 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 849.676856] env[59518]: DEBUG nova.network.neutron [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] [instance: ac3485ac-4817-4492-a196-331002b2cc46] Successfully created port: 9652bc4f-d225-480e-bd8a-03f76cc97724 {{(pid=59518) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 850.337189] env[59518]: DEBUG nova.compute.manager [req-bb0894c3-e523-43b5-876d-721bff36ab1f req-d5e6a88e-b15d-427c-b3ae-2eb046fac4bb service nova] [instance: ac3485ac-4817-4492-a196-331002b2cc46] Received event network-vif-plugged-9652bc4f-d225-480e-bd8a-03f76cc97724 {{(pid=59518) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 850.337407] env[59518]: DEBUG oslo_concurrency.lockutils [req-bb0894c3-e523-43b5-876d-721bff36ab1f req-d5e6a88e-b15d-427c-b3ae-2eb046fac4bb service nova] Acquiring lock 
"ac3485ac-4817-4492-a196-331002b2cc46-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 850.337604] env[59518]: DEBUG oslo_concurrency.lockutils [req-bb0894c3-e523-43b5-876d-721bff36ab1f req-d5e6a88e-b15d-427c-b3ae-2eb046fac4bb service nova] Lock "ac3485ac-4817-4492-a196-331002b2cc46-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 850.337763] env[59518]: DEBUG oslo_concurrency.lockutils [req-bb0894c3-e523-43b5-876d-721bff36ab1f req-d5e6a88e-b15d-427c-b3ae-2eb046fac4bb service nova] Lock "ac3485ac-4817-4492-a196-331002b2cc46-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 850.337919] env[59518]: DEBUG nova.compute.manager [req-bb0894c3-e523-43b5-876d-721bff36ab1f req-d5e6a88e-b15d-427c-b3ae-2eb046fac4bb service nova] [instance: ac3485ac-4817-4492-a196-331002b2cc46] No waiting events found dispatching network-vif-plugged-9652bc4f-d225-480e-bd8a-03f76cc97724 {{(pid=59518) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 850.338074] env[59518]: WARNING nova.compute.manager [req-bb0894c3-e523-43b5-876d-721bff36ab1f req-d5e6a88e-b15d-427c-b3ae-2eb046fac4bb service nova] [instance: ac3485ac-4817-4492-a196-331002b2cc46] Received unexpected event network-vif-plugged-9652bc4f-d225-480e-bd8a-03f76cc97724 for instance with vm_state building and task_state spawning. [ 850.432139] env[59518]: DEBUG nova.network.neutron [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] [instance: ac3485ac-4817-4492-a196-331002b2cc46] Successfully updated port: 9652bc4f-d225-480e-bd8a-03f76cc97724 {{(pid=59518) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 850.443761] env[59518]: DEBUG oslo_concurrency.lockutils [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] Acquiring lock "refresh_cache-ac3485ac-4817-4492-a196-331002b2cc46" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 850.443893] env[59518]: DEBUG oslo_concurrency.lockutils [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] Acquired lock "refresh_cache-ac3485ac-4817-4492-a196-331002b2cc46" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 850.444059] env[59518]: DEBUG nova.network.neutron [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] [instance: ac3485ac-4817-4492-a196-331002b2cc46] Building network info cache for instance {{(pid=59518) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 850.479580] env[59518]: DEBUG nova.network.neutron [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] [instance: ac3485ac-4817-4492-a196-331002b2cc46] Instance cache missing network info. 
{{(pid=59518) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 850.656042] env[59518]: DEBUG nova.network.neutron [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] [instance: ac3485ac-4817-4492-a196-331002b2cc46] Updating instance_info_cache with network_info: [{"id": "9652bc4f-d225-480e-bd8a-03f76cc97724", "address": "fa:16:3e:1b:6b:a0", "network": {"id": "cdabc509-33a4-42ae-82b6-cb4e64be6e60", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-695698360-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3ccdd7632bb54b3cbb7dd36ec079f938", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0dd3c126-9d86-4f9a-b81c-e9627c7a5401", "external-id": "nsx-vlan-transportzone-24", "segmentation_id": 24, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9652bc4f-d2", "ovs_interfaceid": "9652bc4f-d225-480e-bd8a-03f76cc97724", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 850.669126] env[59518]: DEBUG oslo_concurrency.lockutils [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] Releasing lock "refresh_cache-ac3485ac-4817-4492-a196-331002b2cc46" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 850.669464] env[59518]: DEBUG nova.compute.manager [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] [instance: ac3485ac-4817-4492-a196-331002b2cc46] Instance network_info: |[{"id": "9652bc4f-d225-480e-bd8a-03f76cc97724", "address": "fa:16:3e:1b:6b:a0", "network": {"id": "cdabc509-33a4-42ae-82b6-cb4e64be6e60", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-695698360-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3ccdd7632bb54b3cbb7dd36ec079f938", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0dd3c126-9d86-4f9a-b81c-e9627c7a5401", "external-id": "nsx-vlan-transportzone-24", "segmentation_id": 24, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9652bc4f-d2", "ovs_interfaceid": "9652bc4f-d225-480e-bd8a-03f76cc97724", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59518) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1967}} [ 850.669831] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] [instance: ac3485ac-4817-4492-a196-331002b2cc46] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:1b:6b:a0', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '0dd3c126-9d86-4f9a-b81c-e9627c7a5401', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '9652bc4f-d225-480e-bd8a-03f76cc97724', 'vif_model': 'vmxnet3'}] {{(pid=59518) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 850.677239] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] Creating folder: Project (3ccdd7632bb54b3cbb7dd36ec079f938). Parent ref: group-v88807. {{(pid=59518) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 850.677842] env[59518]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7a96acb2-e2e3-44f8-bc80-68c95208ad67 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 850.689371] env[59518]: INFO nova.virt.vmwareapi.vm_util [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] Created folder: Project (3ccdd7632bb54b3cbb7dd36ec079f938) in parent group-v88807. [ 850.689615] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] Creating folder: Instances. Parent ref: group-v88854. {{(pid=59518) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 850.689741] env[59518]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-cab929df-9946-472c-9df4-bee1e151f83d {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 850.700551] env[59518]: INFO nova.virt.vmwareapi.vm_util [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] Created folder: Instances in parent group-v88854. [ 850.700795] env[59518]: DEBUG oslo.service.loopingcall [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59518) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 850.700982] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ac3485ac-4817-4492-a196-331002b2cc46] Creating VM on the ESX host {{(pid=59518) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 850.701216] env[59518]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-284484df-0f68-4c02-8195-3477fca915e1 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 850.720928] env[59518]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 850.720928] env[59518]: value = "task-307979" [ 850.720928] env[59518]: _type = "Task" [ 850.720928] env[59518]: } to complete. 
{{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 850.730102] env[59518]: DEBUG oslo_vmware.api [-] Task: {'id': task-307979, 'name': CreateVM_Task} progress is 0%. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 851.232348] env[59518]: DEBUG oslo_vmware.api [-] Task: {'id': task-307979, 'name': CreateVM_Task} progress is 25%. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 851.735376] env[59518]: DEBUG oslo_vmware.api [-] Task: {'id': task-307979, 'name': CreateVM_Task} progress is 25%. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 852.232644] env[59518]: DEBUG oslo_vmware.api [-] Task: {'id': task-307979, 'name': CreateVM_Task} progress is 99%. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 852.371735] env[59518]: DEBUG nova.compute.manager [req-90aaefcc-fdf6-4d61-842f-16e2b94ed29a req-cacf2e3d-1f21-4877-9b0b-1e8723eaa863 service nova] [instance: ac3485ac-4817-4492-a196-331002b2cc46] Received event network-changed-9652bc4f-d225-480e-bd8a-03f76cc97724 {{(pid=59518) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 852.371933] env[59518]: DEBUG nova.compute.manager [req-90aaefcc-fdf6-4d61-842f-16e2b94ed29a req-cacf2e3d-1f21-4877-9b0b-1e8723eaa863 service nova] [instance: ac3485ac-4817-4492-a196-331002b2cc46] Refreshing instance network info cache due to event network-changed-9652bc4f-d225-480e-bd8a-03f76cc97724. {{(pid=59518) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 852.372182] env[59518]: DEBUG oslo_concurrency.lockutils [req-90aaefcc-fdf6-4d61-842f-16e2b94ed29a req-cacf2e3d-1f21-4877-9b0b-1e8723eaa863 service nova] Acquiring lock "refresh_cache-ac3485ac-4817-4492-a196-331002b2cc46" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 852.372321] env[59518]: DEBUG oslo_concurrency.lockutils [req-90aaefcc-fdf6-4d61-842f-16e2b94ed29a req-cacf2e3d-1f21-4877-9b0b-1e8723eaa863 service nova] Acquired lock "refresh_cache-ac3485ac-4817-4492-a196-331002b2cc46" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 852.372619] env[59518]: DEBUG nova.network.neutron [req-90aaefcc-fdf6-4d61-842f-16e2b94ed29a req-cacf2e3d-1f21-4877-9b0b-1e8723eaa863 service nova] [instance: ac3485ac-4817-4492-a196-331002b2cc46] Refreshing network info cache for port 9652bc4f-d225-480e-bd8a-03f76cc97724 {{(pid=59518) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 852.633593] env[59518]: DEBUG nova.network.neutron [req-90aaefcc-fdf6-4d61-842f-16e2b94ed29a req-cacf2e3d-1f21-4877-9b0b-1e8723eaa863 service nova] [instance: ac3485ac-4817-4492-a196-331002b2cc46] Updated VIF entry in instance network info cache for port 9652bc4f-d225-480e-bd8a-03f76cc97724. 
{{(pid=59518) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 852.633593] env[59518]: DEBUG nova.network.neutron [req-90aaefcc-fdf6-4d61-842f-16e2b94ed29a req-cacf2e3d-1f21-4877-9b0b-1e8723eaa863 service nova] [instance: ac3485ac-4817-4492-a196-331002b2cc46] Updating instance_info_cache with network_info: [{"id": "9652bc4f-d225-480e-bd8a-03f76cc97724", "address": "fa:16:3e:1b:6b:a0", "network": {"id": "cdabc509-33a4-42ae-82b6-cb4e64be6e60", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-695698360-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3ccdd7632bb54b3cbb7dd36ec079f938", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0dd3c126-9d86-4f9a-b81c-e9627c7a5401", "external-id": "nsx-vlan-transportzone-24", "segmentation_id": 24, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9652bc4f-d2", "ovs_interfaceid": "9652bc4f-d225-480e-bd8a-03f76cc97724", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 852.646269] env[59518]: DEBUG oslo_concurrency.lockutils [req-90aaefcc-fdf6-4d61-842f-16e2b94ed29a req-cacf2e3d-1f21-4877-9b0b-1e8723eaa863 service nova] Releasing lock "refresh_cache-ac3485ac-4817-4492-a196-331002b2cc46" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 852.733538] env[59518]: DEBUG oslo_vmware.api [-] Task: {'id': task-307979, 'name': CreateVM_Task, 'duration_secs': 1.549023} completed successfully. 
{{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 852.733687] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ac3485ac-4817-4492-a196-331002b2cc46] Created VM on the ESX host {{(pid=59518) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 852.734273] env[59518]: DEBUG oslo_concurrency.lockutils [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 852.734421] env[59518]: DEBUG oslo_concurrency.lockutils [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 852.734728] env[59518]: DEBUG oslo_concurrency.lockutils [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 852.734958] env[59518]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d36e6eb0-8aa2-4cee-815b-a1bc30bebf54 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 852.739512] env[59518]: DEBUG oslo_vmware.api [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] Waiting for the task: (returnval){ [ 852.739512] env[59518]: value = "session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]527be58e-2239-2128-d7e2-14863e285790" [ 852.739512] env[59518]: _type = "Task" [ 852.739512] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 852.747207] env[59518]: DEBUG oslo_vmware.api [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] Task: {'id': session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]527be58e-2239-2128-d7e2-14863e285790, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 853.249365] env[59518]: DEBUG oslo_concurrency.lockutils [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 853.249365] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] [instance: ac3485ac-4817-4492-a196-331002b2cc46] Processing image e70539a9-144d-4900-807e-914ae0cc8539 {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 853.249538] env[59518]: DEBUG oslo_concurrency.lockutils [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 854.447744] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 854.448022] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Cleaning up deleted instances {{(pid=59518) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11095}} [ 854.463946] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] There are 1 instances to clean {{(pid=59518) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11104}} [ 854.464222] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: 37d863b9-bfcb-4d1f-b99b-832276bd640f] Instance has had 0 of 5 cleanup attempts {{(pid=59518) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11108}} [ 854.497863] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 854.498012] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Cleaning up deleted instances with incomplete migration {{(pid=59518) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11133}} [ 854.506000] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 855.511760] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 857.447005] env[59518]: DEBUG oslo_service.periodic_task [None 
req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 858.448434] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 859.442852] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 859.447466] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 859.447623] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Starting heal instance info cache {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9808}} [ 859.447738] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Rebuilding the list of instances to heal {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}} [ 859.466859] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 859.467128] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: b4fb287e-7329-4002-911a-2d1eee138372] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 859.467128] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 859.467298] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: eead3c41-1a63-48f7-941e-24470658ed13] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 859.467380] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 859.467486] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: 282b61db-76cd-44c3-b500-7a465e903c97] Skipping network cache update for instance because it is Building. 
{{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 859.467635] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 859.467779] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 859.467897] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 859.468032] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: ac3485ac-4817-4492-a196-331002b2cc46] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 859.468145] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Didn't find any instances for network info cache update. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9894}} [ 859.468591] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 859.468755] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 859.468881] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=59518) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10427}} [ 860.448708] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 862.448554] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 862.458214] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 862.458421] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 862.458572] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 862.458719] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59518) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 862.459803] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8d4acb9f-5446-4171-88e7-dd5e55c4a924 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 862.469455] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a276f36f-d76b-4fab-94b1-7a7900a5f967 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 862.483300] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb54ca37-4042-4d51-8719-44adcd1890e9 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 862.489701] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-97da875f-c5ab-472b-aa68-cb717a46c8b0 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 862.520724] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181708MB free_disk=174GB free_vcpus=48 pci_devices=None {{(pid=59518) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 862.520810] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Acquiring lock 
"compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 862.521012] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 862.686932] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance a894a8af-52b8-4b1c-a5ea-2469f06ea17a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 862.687140] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance b4fb287e-7329-4002-911a-2d1eee138372 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 862.687287] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance a85abea1-8e8d-4007-803d-e36fff55e587 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 862.687403] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance eead3c41-1a63-48f7-941e-24470658ed13 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 862.687514] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 04a58b0b-dfd8-4227-9c10-a69225fa5a53 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 862.687622] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 282b61db-76cd-44c3-b500-7a465e903c97 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 862.687730] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 3e59b5d7-978d-405a-b68a-47ee03b9a713 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 862.687840] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance af4b8dd9-a05d-427e-a147-76c7cfec5862 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 862.687949] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance ae88d565-bbf5-4c29-aee9-364c23086de5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 862.688074] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance ac3485ac-4817-4492-a196-331002b2cc46 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 862.699173] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance ad83492d-05a6-428d-b343-740c977105f8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 862.709379] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance c4eb8b6b-c958-4d21-9e6c-c1ba6b62c619 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 862.719411] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 2aaaecaa-86c6-4d8c-89a8-7d8e2405d294 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 862.728450] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 3bdabf32-3735-4670-8591-fad410629d95 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 862.737589] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 68abcc97-7992-474e-873c-ce247f4f1bec has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 862.746660] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 00a7659b-41f1-4224-a111-01670979c415 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 862.756708] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 7d4fa130-c399-4e8c-a711-33a08ed5dde9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 862.766142] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 2277f51e-d169-416c-b86b-fb8a019a309d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 862.775629] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 935d4358-07b0-423b-8685-26d5bafe9e2f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 862.784564] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 3981aa30-0515-4764-9aac-d0c99a48b064 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 862.784789] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=59518) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 862.784942] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=59518) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 862.800319] env[59518]: DEBUG nova.scheduler.client.report [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Refreshing inventories for resource provider ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd {{(pid=59518) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 862.814051] env[59518]: DEBUG nova.scheduler.client.report [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Updating ProviderTree inventory for provider ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 174, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59518) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 862.814225] env[59518]: DEBUG nova.compute.provider_tree [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Updating inventory in ProviderTree for provider ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 174, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59518) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 862.823986] env[59518]: DEBUG nova.scheduler.client.report [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Refreshing aggregate associations for resource provider ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd, aggregates: None {{(pid=59518) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 862.838231] env[59518]: DEBUG nova.scheduler.client.report [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Refreshing trait associations for resource provider ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd, traits: COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NODE {{(pid=59518) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 863.048055] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-63fb8273-01fa-4ab2-b5d7-60a895e016b1 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 863.055362] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-28df1acc-4b84-4157-a657-f4d9cd1cd467 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 863.085061] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c32bab3-c148-4f0f-a2b9-5ea9ef921fa2 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 863.092256] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-29dcbc2c-45da-46a1-9e88-d7cdfdb43005 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 863.106934] env[59518]: DEBUG nova.compute.provider_tree [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Inventory has not changed in ProviderTree for provider: ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd {{(pid=59518) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 863.115097] env[59518]: DEBUG nova.scheduler.client.report [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Inventory has not changed for provider ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 174, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59518) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 863.127819] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59518) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 863.127988] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.607s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 880.965255] env[59518]: DEBUG nova.compute.manager [req-5ec6efa7-456a-4d3c-9f03-a86a2e9923a9 req-d5bbc894-0d08-4f8a-a8cc-625ea83c2904 service nova] [instance: ac3485ac-4817-4492-a196-331002b2cc46] Received event network-vif-deleted-9652bc4f-d225-480e-bd8a-03f76cc97724 {{(pid=59518) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 897.159256] env[59518]: WARNING oslo_vmware.rw_handles [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 897.159256] env[59518]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 897.159256] env[59518]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 897.159256] env[59518]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 897.159256] env[59518]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 897.159256] env[59518]: ERROR oslo_vmware.rw_handles response.begin() [ 897.159256] env[59518]: ERROR oslo_vmware.rw_handles File 
"/usr/lib/python3.10/http/client.py", line 318, in begin [ 897.159256] env[59518]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 897.159256] env[59518]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 897.159256] env[59518]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 897.159256] env[59518]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 897.159256] env[59518]: ERROR oslo_vmware.rw_handles [ 897.159909] env[59518]: DEBUG nova.virt.vmwareapi.images [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] Downloaded image file data e70539a9-144d-4900-807e-914ae0cc8539 to vmware_temp/b819fd3b-13ed-47b1-8e95-2c2d8bc3c210/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk on the data store datastore1 {{(pid=59518) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 897.161214] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] Caching image {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 897.161460] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Copying Virtual Disk [datastore1] vmware_temp/b819fd3b-13ed-47b1-8e95-2c2d8bc3c210/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk to [datastore1] vmware_temp/b819fd3b-13ed-47b1-8e95-2c2d8bc3c210/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk {{(pid=59518) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 897.161738] env[59518]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-d66cc169-850c-4f45-b808-0831f943383f {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 897.170132] env[59518]: DEBUG oslo_vmware.api [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Waiting for the task: (returnval){ [ 897.170132] env[59518]: value = "task-307980" [ 897.170132] env[59518]: _type = "Task" [ 897.170132] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 897.177701] env[59518]: DEBUG oslo_vmware.api [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Task: {'id': task-307980, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 897.681030] env[59518]: DEBUG oslo_vmware.exceptions [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Fault InvalidArgument not matched. 
{{(pid=59518) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 897.681284] env[59518]: DEBUG oslo_concurrency.lockutils [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 897.681821] env[59518]: ERROR nova.compute.manager [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 897.681821] env[59518]: Faults: ['InvalidArgument'] [ 897.681821] env[59518]: ERROR nova.compute.manager [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] Traceback (most recent call last): [ 897.681821] env[59518]: ERROR nova.compute.manager [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 897.681821] env[59518]: ERROR nova.compute.manager [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] yield resources [ 897.681821] env[59518]: ERROR nova.compute.manager [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 897.681821] env[59518]: ERROR nova.compute.manager [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] self.driver.spawn(context, instance, image_meta, [ 897.681821] env[59518]: ERROR nova.compute.manager [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 897.681821] env[59518]: ERROR nova.compute.manager [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 897.681821] env[59518]: ERROR nova.compute.manager [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 897.681821] env[59518]: ERROR nova.compute.manager [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] self._fetch_image_if_missing(context, vi) [ 897.681821] env[59518]: ERROR nova.compute.manager [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 897.682310] env[59518]: ERROR nova.compute.manager [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] image_cache(vi, tmp_image_ds_loc) [ 897.682310] env[59518]: ERROR nova.compute.manager [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 897.682310] env[59518]: ERROR nova.compute.manager [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] vm_util.copy_virtual_disk( [ 897.682310] env[59518]: ERROR nova.compute.manager [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 897.682310] env[59518]: ERROR nova.compute.manager [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] session._wait_for_task(vmdk_copy_task) [ 897.682310] env[59518]: ERROR nova.compute.manager [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in 
_wait_for_task [ 897.682310] env[59518]: ERROR nova.compute.manager [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] return self.wait_for_task(task_ref) [ 897.682310] env[59518]: ERROR nova.compute.manager [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 897.682310] env[59518]: ERROR nova.compute.manager [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] return evt.wait() [ 897.682310] env[59518]: ERROR nova.compute.manager [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 897.682310] env[59518]: ERROR nova.compute.manager [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] result = hub.switch() [ 897.682310] env[59518]: ERROR nova.compute.manager [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 897.682310] env[59518]: ERROR nova.compute.manager [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] return self.greenlet.switch() [ 897.682812] env[59518]: ERROR nova.compute.manager [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 897.682812] env[59518]: ERROR nova.compute.manager [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] self.f(*self.args, **self.kw) [ 897.682812] env[59518]: ERROR nova.compute.manager [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 897.682812] env[59518]: ERROR nova.compute.manager [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] raise exceptions.translate_fault(task_info.error) [ 897.682812] env[59518]: ERROR nova.compute.manager [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 897.682812] env[59518]: ERROR nova.compute.manager [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] Faults: ['InvalidArgument'] [ 897.682812] env[59518]: ERROR nova.compute.manager [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] [ 897.682812] env[59518]: INFO nova.compute.manager [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] Terminating instance [ 897.683680] env[59518]: DEBUG oslo_concurrency.lockutils [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 897.683907] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 897.684556] env[59518]: DEBUG nova.compute.manager [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] Start destroying the instance on the 
hypervisor. {{(pid=59518) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 897.684726] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] Destroying instance {{(pid=59518) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 897.684936] env[59518]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9323a918-3455-48a1-ae60-e02cf63cc9c3 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 897.687632] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c54e081f-b651-44f7-9e62-a721046aac0e {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 897.695751] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] Unregistering the VM {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 897.696034] env[59518]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-80cfcbba-e58b-44f5-8ebd-e647df5095e7 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 897.699026] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 897.699188] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=59518) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 897.700243] env[59518]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d6ecd015-3641-496b-8506-3313065fc175 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 897.708280] env[59518]: DEBUG oslo_vmware.api [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Waiting for the task: (returnval){ [ 897.708280] env[59518]: value = "session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]52458db1-5a08-b9ca-bc33-c29c0ca7b6ea" [ 897.708280] env[59518]: _type = "Task" [ 897.708280] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 897.720954] env[59518]: DEBUG oslo_vmware.api [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Task: {'id': session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]52458db1-5a08-b9ca-bc33-c29c0ca7b6ea, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 897.769327] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] Unregistered the VM {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 897.769541] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] Deleting contents of the VM from datastore datastore1 {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 897.769710] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Deleting the datastore file [datastore1] a894a8af-52b8-4b1c-a5ea-2469f06ea17a {{(pid=59518) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 897.769962] env[59518]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-2396410d-8916-43f5-9060-4f511ea1b70a {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 897.776320] env[59518]: DEBUG oslo_vmware.api [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Waiting for the task: (returnval){ [ 897.776320] env[59518]: value = "task-307982" [ 897.776320] env[59518]: _type = "Task" [ 897.776320] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 897.785683] env[59518]: DEBUG oslo_vmware.api [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Task: {'id': task-307982, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 898.218482] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: b4fb287e-7329-4002-911a-2d1eee138372] Preparing fetch location {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 898.218728] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Creating directory with path [datastore1] vmware_temp/2ee164d9-b6d6-4d3c-8d4f-a5b35f57a4b8/e70539a9-144d-4900-807e-914ae0cc8539 {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 898.218915] env[59518]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-97be2311-6409-456c-a52b-bd7be247c57e {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 898.231330] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Created directory with path [datastore1] vmware_temp/2ee164d9-b6d6-4d3c-8d4f-a5b35f57a4b8/e70539a9-144d-4900-807e-914ae0cc8539 {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 898.231519] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: b4fb287e-7329-4002-911a-2d1eee138372] Fetch image to [datastore1] vmware_temp/2ee164d9-b6d6-4d3c-8d4f-a5b35f57a4b8/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 898.231677] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: b4fb287e-7329-4002-911a-2d1eee138372] Downloading image file data e70539a9-144d-4900-807e-914ae0cc8539 to [datastore1] vmware_temp/2ee164d9-b6d6-4d3c-8d4f-a5b35f57a4b8/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk on the data store datastore1 {{(pid=59518) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 898.232488] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-802f102e-4b00-4a97-a609-f7ab362bed9c {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 898.239264] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ed31392b-1c3e-4964-9b2e-d1c184abb0ec {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 898.248818] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0f0c6325-e6ea-45b2-af4a-c28d002d49d6 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 898.283113] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a1e5741b-8532-4157-aa9d-f85624044baa {{(pid=59518) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 898.290335] env[59518]: DEBUG oslo_vmware.api [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Task: {'id': task-307982, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.064811} completed successfully. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 898.291807] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Deleted the datastore file {{(pid=59518) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 898.291998] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] Deleted contents of the VM from datastore datastore1 {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 898.292202] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] Instance destroyed {{(pid=59518) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 898.292367] env[59518]: INFO nova.compute.manager [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] Took 0.61 seconds to destroy the instance on the hypervisor. 
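
The records just above show the oslo.vmware task pattern that recurs throughout this log: an API call (here FileManager.DeleteDatastoreFile_Task) returns a task handle such as "task-307982", and wait_for_task then polls the task, logging "progress is N%." until it either completes successfully (with a 'duration_secs' entry, as below) or raises a fault. As a rough illustration of that polling loop, not oslo.vmware's actual implementation, here is a minimal sketch; get_task_info (a callable returning a dict with 'state', 'progress' and 'error' keys) and TaskFailed are hypothetical stand-ins:

    import time

    class TaskFailed(Exception):
        """Stand-in for the translated fault (cf. VimFaultException later in this log)."""

    def wait_for_task(get_task_info, task_id, interval=0.5, timeout=60.0):
        # Poll until the task reaches a terminal state, mirroring the
        # "Task: {...} progress is N%." / "completed successfully." records.
        deadline = time.monotonic() + timeout
        while time.monotonic() < deadline:
            info = get_task_info(task_id)
            if info['state'] == 'success':
                return info
            if info['state'] == 'error':
                raise TaskFailed(info['error'])
            print("Task: {'id': %r} progress is %d%%." % (task_id, info.get('progress', 0)))
            time.sleep(interval)
        raise TimeoutError('task %s did not complete within %ss' % (task_id, timeout))

When the poll raises, the fault propagates up through the caller, which is exactly the shape of the traceback recorded further down in this log.
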
[ 898.294472] env[59518]: DEBUG nova.compute.claims [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] Aborting claim: {{(pid=59518) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 898.294589] env[59518]: DEBUG oslo_concurrency.lockutils [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 898.294720] env[59518]: DEBUG oslo_concurrency.lockutils [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 898.297013] env[59518]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-bd4b2835-8f0a-4a19-8b90-435cf3f5c5dc {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 898.319238] env[59518]: DEBUG nova.virt.vmwareapi.images [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: b4fb287e-7329-4002-911a-2d1eee138372] Downloading image file data e70539a9-144d-4900-807e-914ae0cc8539 to the data store datastore1 {{(pid=59518) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 898.369883] env[59518]: DEBUG oslo_vmware.rw_handles [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2ee164d9-b6d6-4d3c-8d4f-a5b35f57a4b8/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59518) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 898.428381] env[59518]: DEBUG oslo_vmware.rw_handles [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Completed reading data from the image iterator. {{(pid=59518) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 898.428551] env[59518]: DEBUG oslo_vmware.rw_handles [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2ee164d9-b6d6-4d3c-8d4f-a5b35f57a4b8/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=59518) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 898.592375] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-34b6b3d4-3c8e-4068-8b1c-f44265e6b775 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 898.601604] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-813d3adf-ce64-40dd-832b-f16c2d7a2906 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 898.634839] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b48b72de-0a0c-4cf9-8293-dbb643714760 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 898.642079] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16e90f02-8845-4314-aef3-8b4bebc9aeb5 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 898.659908] env[59518]: DEBUG nova.compute.provider_tree [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Inventory has not changed in ProviderTree for provider: ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd {{(pid=59518) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 898.670187] env[59518]: DEBUG nova.scheduler.client.report [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Inventory has not changed for provider ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 174, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59518) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 898.689536] env[59518]: DEBUG oslo_concurrency.lockutils [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.395s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 898.690151] env[59518]: ERROR nova.compute.manager [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 898.690151] env[59518]: Faults: ['InvalidArgument'] [ 898.690151] env[59518]: ERROR nova.compute.manager [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] Traceback (most recent call last): [ 898.690151] env[59518]: ERROR nova.compute.manager [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 898.690151] env[59518]: ERROR nova.compute.manager [instance: 
a894a8af-52b8-4b1c-a5ea-2469f06ea17a] self.driver.spawn(context, instance, image_meta, [ 898.690151] env[59518]: ERROR nova.compute.manager [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 898.690151] env[59518]: ERROR nova.compute.manager [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 898.690151] env[59518]: ERROR nova.compute.manager [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 898.690151] env[59518]: ERROR nova.compute.manager [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] self._fetch_image_if_missing(context, vi) [ 898.690151] env[59518]: ERROR nova.compute.manager [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 898.690151] env[59518]: ERROR nova.compute.manager [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] image_cache(vi, tmp_image_ds_loc) [ 898.690151] env[59518]: ERROR nova.compute.manager [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 898.690498] env[59518]: ERROR nova.compute.manager [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] vm_util.copy_virtual_disk( [ 898.690498] env[59518]: ERROR nova.compute.manager [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 898.690498] env[59518]: ERROR nova.compute.manager [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] session._wait_for_task(vmdk_copy_task) [ 898.690498] env[59518]: ERROR nova.compute.manager [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 898.690498] env[59518]: ERROR nova.compute.manager [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] return self.wait_for_task(task_ref) [ 898.690498] env[59518]: ERROR nova.compute.manager [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 898.690498] env[59518]: ERROR nova.compute.manager [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] return evt.wait() [ 898.690498] env[59518]: ERROR nova.compute.manager [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 898.690498] env[59518]: ERROR nova.compute.manager [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] result = hub.switch() [ 898.690498] env[59518]: ERROR nova.compute.manager [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 898.690498] env[59518]: ERROR nova.compute.manager [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] return self.greenlet.switch() [ 898.690498] env[59518]: ERROR nova.compute.manager [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 898.690498] env[59518]: ERROR nova.compute.manager [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] self.f(*self.args, **self.kw) [ 898.690948] env[59518]: ERROR nova.compute.manager [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 898.690948] env[59518]: ERROR 
nova.compute.manager [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] raise exceptions.translate_fault(task_info.error) [ 898.690948] env[59518]: ERROR nova.compute.manager [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 898.690948] env[59518]: ERROR nova.compute.manager [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] Faults: ['InvalidArgument'] [ 898.690948] env[59518]: ERROR nova.compute.manager [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] [ 898.690948] env[59518]: DEBUG nova.compute.utils [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] VimFaultException {{(pid=59518) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 898.692261] env[59518]: DEBUG nova.compute.manager [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] Build of instance a894a8af-52b8-4b1c-a5ea-2469f06ea17a was re-scheduled: A specified parameter was not correct: fileType [ 898.692261] env[59518]: Faults: ['InvalidArgument'] {{(pid=59518) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 898.692638] env[59518]: DEBUG nova.compute.manager [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] Unplugging VIFs for instance {{(pid=59518) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 898.692796] env[59518]: DEBUG nova.compute.manager [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. 
{{(pid=59518) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 898.692938] env[59518]: DEBUG nova.compute.manager [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] Deallocating network for instance {{(pid=59518) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 898.693151] env[59518]: DEBUG nova.network.neutron [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] deallocate_for_instance() {{(pid=59518) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 899.199149] env[59518]: DEBUG nova.network.neutron [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] Updating instance_info_cache with network_info: [] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 899.209720] env[59518]: INFO nova.compute.manager [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] Took 0.52 seconds to deallocate network for instance. [ 899.310029] env[59518]: INFO nova.scheduler.client.report [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Deleted allocations for instance a894a8af-52b8-4b1c-a5ea-2469f06ea17a [ 899.330350] env[59518]: DEBUG oslo_concurrency.lockutils [None req-4cff99c2-afc6-43e8-aa27-b7090bb01b9e tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Lock "a894a8af-52b8-4b1c-a5ea-2469f06ea17a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 293.251s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 899.331522] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "a894a8af-52b8-4b1c-a5ea-2469f06ea17a" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 288.732s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 899.331699] env[59518]: INFO nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] During sync_power_state the instance has a pending task (spawning). Skip. 
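
A note on the lockutils records that bracket this section ('Acquiring lock "..." by "..."', 'acquired ... :: waited N s', '"released" ... :: held N s'): they come from a decorator that wraps the critical section and logs the wrapped function's dotted name, which is consistent with Python's __qualname__ and explains the ".<locals>." segments on nested helpers such as build_and_run_instance.<locals>._locked_do_build_and_run_instance. A toy analogue of the pattern, not oslo_concurrency's real code (the in-process _LOCKS registry is an assumption for the sketch):

    import threading
    import time
    from functools import wraps

    _LOCKS = {}  # name -> threading.Lock; hypothetical process-local registry

    def synchronized(name):
        lock = _LOCKS.setdefault(name, threading.Lock())
        def decorator(fn):
            @wraps(fn)
            def inner(*args, **kwargs):
                print('Acquiring lock "%s" by "%s"' % (name, fn.__qualname__))
                t0 = time.monotonic()
                with lock:
                    # Log how long we blocked before entering the critical section.
                    print('Lock "%s" acquired by "%s" :: waited %.3fs'
                          % (name, fn.__qualname__, time.monotonic() - t0))
                    t1 = time.monotonic()
                    try:
                        return fn(*args, **kwargs)
                    finally:
                        # Log how long the critical section was held.
                        print('Lock "%s" "released" by "%s" :: held %.3fs'
                              % (name, fn.__qualname__, time.monotonic() - t1))
            return inner
        return decorator

Decorating a function nested inside a method, as Nova does with _locked_do_build_and_run_instance, gives fn.__qualname__ values of exactly the shape logged here.
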
[ 899.331859] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "a894a8af-52b8-4b1c-a5ea-2469f06ea17a" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 899.332313] env[59518]: DEBUG oslo_concurrency.lockutils [None req-79683d9b-6071-48f3-a544-cf58f8007bbb tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Lock "a894a8af-52b8-4b1c-a5ea-2469f06ea17a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 95.393s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 899.332510] env[59518]: DEBUG oslo_concurrency.lockutils [None req-79683d9b-6071-48f3-a544-cf58f8007bbb tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Acquiring lock "a894a8af-52b8-4b1c-a5ea-2469f06ea17a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 899.332703] env[59518]: DEBUG oslo_concurrency.lockutils [None req-79683d9b-6071-48f3-a544-cf58f8007bbb tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Lock "a894a8af-52b8-4b1c-a5ea-2469f06ea17a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 899.332884] env[59518]: DEBUG oslo_concurrency.lockutils [None req-79683d9b-6071-48f3-a544-cf58f8007bbb tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Lock "a894a8af-52b8-4b1c-a5ea-2469f06ea17a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 899.335155] env[59518]: INFO nova.compute.manager [None req-79683d9b-6071-48f3-a544-cf58f8007bbb tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] Terminating instance [ 899.336737] env[59518]: DEBUG nova.compute.manager [None req-79683d9b-6071-48f3-a544-cf58f8007bbb tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] Start destroying the instance on the hypervisor. 
{{(pid=59518) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 899.336915] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-79683d9b-6071-48f3-a544-cf58f8007bbb tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] Destroying instance {{(pid=59518) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 899.337808] env[59518]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-f2cc1193-5292-4bbf-8a83-4fa17ce95722 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 899.347049] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3ecc126b-56ff-48e0-8399-ba659bff1cf3 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 899.359336] env[59518]: DEBUG nova.compute.manager [None req-c0b5d5aa-bf34-416c-98c9-a0b836996290 tempest-AttachVolumeShelveTestJSON-165117633 tempest-AttachVolumeShelveTestJSON-165117633-project-member] [instance: ad83492d-05a6-428d-b343-740c977105f8] Starting instance... {{(pid=59518) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 899.380775] env[59518]: WARNING nova.virt.vmwareapi.vmops [None req-79683d9b-6071-48f3-a544-cf58f8007bbb tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance a894a8af-52b8-4b1c-a5ea-2469f06ea17a could not be found. [ 899.380985] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-79683d9b-6071-48f3-a544-cf58f8007bbb tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] Instance destroyed {{(pid=59518) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 899.381180] env[59518]: INFO nova.compute.manager [None req-79683d9b-6071-48f3-a544-cf58f8007bbb tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] Took 0.04 seconds to destroy the instance on the hypervisor. [ 899.381410] env[59518]: DEBUG oslo.service.loopingcall [None req-79683d9b-6071-48f3-a544-cf58f8007bbb tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=59518) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 899.381628] env[59518]: DEBUG nova.compute.manager [-] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] Deallocating network for instance {{(pid=59518) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 899.381720] env[59518]: DEBUG nova.network.neutron [-] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] deallocate_for_instance() {{(pid=59518) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 899.387015] env[59518]: DEBUG nova.compute.manager [None req-c0b5d5aa-bf34-416c-98c9-a0b836996290 tempest-AttachVolumeShelveTestJSON-165117633 tempest-AttachVolumeShelveTestJSON-165117633-project-member] [instance: ad83492d-05a6-428d-b343-740c977105f8] Instance disappeared before build. 
{{(pid=59518) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 899.406959] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c0b5d5aa-bf34-416c-98c9-a0b836996290 tempest-AttachVolumeShelveTestJSON-165117633 tempest-AttachVolumeShelveTestJSON-165117633-project-member] Lock "ad83492d-05a6-428d-b343-740c977105f8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 205.278s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 899.417874] env[59518]: DEBUG nova.compute.manager [None req-24c28a7d-c7e9-499f-81f5-2d33b6d14bf4 tempest-ServerRescueTestJSON-1806638401 tempest-ServerRescueTestJSON-1806638401-project-member] [instance: c4eb8b6b-c958-4d21-9e6c-c1ba6b62c619] Starting instance... {{(pid=59518) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 899.458572] env[59518]: DEBUG nova.compute.manager [None req-24c28a7d-c7e9-499f-81f5-2d33b6d14bf4 tempest-ServerRescueTestJSON-1806638401 tempest-ServerRescueTestJSON-1806638401-project-member] [instance: c4eb8b6b-c958-4d21-9e6c-c1ba6b62c619] Instance disappeared before build. {{(pid=59518) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 899.464092] env[59518]: DEBUG nova.network.neutron [-] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] Updating instance_info_cache with network_info: [] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 899.470760] env[59518]: INFO nova.compute.manager [-] [instance: a894a8af-52b8-4b1c-a5ea-2469f06ea17a] Took 0.09 seconds to deallocate network for instance. [ 899.496827] env[59518]: DEBUG oslo_concurrency.lockutils [None req-24c28a7d-c7e9-499f-81f5-2d33b6d14bf4 tempest-ServerRescueTestJSON-1806638401 tempest-ServerRescueTestJSON-1806638401-project-member] Lock "c4eb8b6b-c958-4d21-9e6c-c1ba6b62c619" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 202.502s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 899.506737] env[59518]: DEBUG nova.compute.manager [None req-edbf377f-1109-47fb-8663-576991df6c29 tempest-ServersNegativeTestMultiTenantJSON-1906642749 tempest-ServersNegativeTestMultiTenantJSON-1906642749-project-member] [instance: 2aaaecaa-86c6-4d8c-89a8-7d8e2405d294] Starting instance... {{(pid=59518) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 899.554134] env[59518]: DEBUG nova.compute.manager [None req-edbf377f-1109-47fb-8663-576991df6c29 tempest-ServersNegativeTestMultiTenantJSON-1906642749 tempest-ServersNegativeTestMultiTenantJSON-1906642749-project-member] [instance: 2aaaecaa-86c6-4d8c-89a8-7d8e2405d294] Instance disappeared before build. 
{{(pid=59518) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 899.583444] env[59518]: DEBUG oslo_concurrency.lockutils [None req-edbf377f-1109-47fb-8663-576991df6c29 tempest-ServersNegativeTestMultiTenantJSON-1906642749 tempest-ServersNegativeTestMultiTenantJSON-1906642749-project-member] Lock "2aaaecaa-86c6-4d8c-89a8-7d8e2405d294" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 200.515s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 899.604856] env[59518]: DEBUG nova.compute.manager [None req-fe5a9076-47ad-49dd-9a3c-9ca5e96c3b51 tempest-ServerAddressesTestJSON-29904488 tempest-ServerAddressesTestJSON-29904488-project-member] [instance: 3bdabf32-3735-4670-8591-fad410629d95] Starting instance... {{(pid=59518) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 899.648482] env[59518]: DEBUG nova.compute.manager [None req-fe5a9076-47ad-49dd-9a3c-9ca5e96c3b51 tempest-ServerAddressesTestJSON-29904488 tempest-ServerAddressesTestJSON-29904488-project-member] [instance: 3bdabf32-3735-4670-8591-fad410629d95] Instance disappeared before build. {{(pid=59518) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 899.665204] env[59518]: DEBUG oslo_concurrency.lockutils [None req-79683d9b-6071-48f3-a544-cf58f8007bbb tempest-ServerExternalEventsTest-2122356162 tempest-ServerExternalEventsTest-2122356162-project-member] Lock "a894a8af-52b8-4b1c-a5ea-2469f06ea17a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.333s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 899.675266] env[59518]: DEBUG oslo_concurrency.lockutils [None req-fe5a9076-47ad-49dd-9a3c-9ca5e96c3b51 tempest-ServerAddressesTestJSON-29904488 tempest-ServerAddressesTestJSON-29904488-project-member] Lock "3bdabf32-3735-4670-8591-fad410629d95" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 200.217s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 899.685275] env[59518]: DEBUG nova.compute.manager [None req-fde14fbb-4e46-46c1-87d5-0dab64b86a3a tempest-ServerGroupTestJSON-923375532 tempest-ServerGroupTestJSON-923375532-project-member] [instance: 68abcc97-7992-474e-873c-ce247f4f1bec] Starting instance... {{(pid=59518) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 899.716848] env[59518]: DEBUG nova.compute.manager [None req-fde14fbb-4e46-46c1-87d5-0dab64b86a3a tempest-ServerGroupTestJSON-923375532 tempest-ServerGroupTestJSON-923375532-project-member] [instance: 68abcc97-7992-474e-873c-ce247f4f1bec] Instance disappeared before build. 
{{(pid=59518) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 899.739432] env[59518]: DEBUG oslo_concurrency.lockutils [None req-fde14fbb-4e46-46c1-87d5-0dab64b86a3a tempest-ServerGroupTestJSON-923375532 tempest-ServerGroupTestJSON-923375532-project-member] Lock "68abcc97-7992-474e-873c-ce247f4f1bec" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 197.732s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 899.750216] env[59518]: DEBUG nova.compute.manager [None req-b0269c41-6fb0-4392-b1b8-cc70fdaf886c tempest-ServersTestBootFromVolume-813210871 tempest-ServersTestBootFromVolume-813210871-project-member] [instance: 00a7659b-41f1-4224-a111-01670979c415] Starting instance... {{(pid=59518) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 899.781702] env[59518]: DEBUG nova.compute.manager [None req-b0269c41-6fb0-4392-b1b8-cc70fdaf886c tempest-ServersTestBootFromVolume-813210871 tempest-ServersTestBootFromVolume-813210871-project-member] [instance: 00a7659b-41f1-4224-a111-01670979c415] Instance disappeared before build. {{(pid=59518) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 899.813446] env[59518]: DEBUG oslo_concurrency.lockutils [None req-b0269c41-6fb0-4392-b1b8-cc70fdaf886c tempest-ServersTestBootFromVolume-813210871 tempest-ServersTestBootFromVolume-813210871-project-member] Lock "00a7659b-41f1-4224-a111-01670979c415" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 197.297s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 899.835654] env[59518]: DEBUG nova.compute.manager [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] Starting instance... 
{{(pid=59518) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 899.894557] env[59518]: DEBUG oslo_concurrency.lockutils [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 899.894723] env[59518]: DEBUG oslo_concurrency.lockutils [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 899.896330] env[59518]: INFO nova.compute.claims [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 900.143064] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93f92b29-3982-4d99-b73a-14d356e0db04 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 900.151990] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fe711309-b35a-4be5-bf2b-79a0a685f030 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 900.184573] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a1a78c49-8b74-4f05-bcdc-f84e48de59ff {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 900.192880] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9371b41b-69fb-4d10-9383-d7d41860a05a {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 900.209768] env[59518]: DEBUG nova.compute.provider_tree [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Inventory has not changed in ProviderTree for provider: ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd {{(pid=59518) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 900.237731] env[59518]: DEBUG nova.scheduler.client.report [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Inventory has not changed for provider ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 174, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59518) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 900.237731] env[59518]: DEBUG oslo_concurrency.lockutils [None 
req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.342s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 900.237731] env[59518]: DEBUG nova.compute.manager [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] Start building networks asynchronously for instance. {{(pid=59518) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 900.273633] env[59518]: DEBUG nova.compute.utils [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Using /dev/sd instead of None {{(pid=59518) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 900.275207] env[59518]: DEBUG nova.compute.manager [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] Allocating IP information in the background. {{(pid=59518) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 900.275373] env[59518]: DEBUG nova.network.neutron [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] allocate_for_instance() {{(pid=59518) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 900.290227] env[59518]: DEBUG nova.compute.manager [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] Start building block device mappings for instance. {{(pid=59518) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 900.392047] env[59518]: DEBUG nova.compute.manager [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] Start spawning the instance on the hypervisor. 
{{(pid=59518) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 900.412811] env[59518]: DEBUG nova.virt.hardware [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-05-28T13:15:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-05-28T13:15:21Z,direct_url=<?>,disk_format='vmdk',id=e70539a9-144d-4900-807e-914ae0cc8539,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='33bab75ff2cf45ecb4ab54af3adf83ee',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2024-05-28T13:15:22Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 900.413177] env[59518]: DEBUG nova.virt.hardware [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Flavor limits 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 900.413412] env[59518]: DEBUG nova.virt.hardware [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Image limits 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 900.413637] env[59518]: DEBUG nova.virt.hardware [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Flavor pref 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 900.413812] env[59518]: DEBUG nova.virt.hardware [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Image pref 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 900.414269] env[59518]: DEBUG nova.virt.hardware [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 900.414348] env[59518]: DEBUG nova.virt.hardware [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 900.414489] env[59518]: DEBUG nova.virt.hardware [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59518) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 900.414789] env[59518]: DEBUG nova.virt.hardware [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Got 1 possible topologies {{(pid=59518) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 900.414997] env[59518]: DEBUG nova.virt.hardware [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 900.415201] env[59518]: DEBUG nova.virt.hardware [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 900.416079] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-939d35ed-6226-4473-aad5-1d972a911a68 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 900.429419] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cde4c31e-1638-43f2-b5a5-5f7103362941 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 900.530809] env[59518]: DEBUG nova.policy [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '34d1251a0db64dc7a4a20085390672a3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7b766a9205774740bbff73e46bd3b905', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59518) authorize /opt/stack/nova/nova/policy.py:203}} [ 900.900928] env[59518]: DEBUG nova.network.neutron [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] Successfully created port: b658b06d-96c6-40e3-92f2-3f42a66289c5 {{(pid=59518) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 901.674196] env[59518]: DEBUG nova.network.neutron [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] Successfully updated port: b658b06d-96c6-40e3-92f2-3f42a66289c5 {{(pid=59518) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 901.688112] env[59518]: DEBUG oslo_concurrency.lockutils [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Acquiring lock "refresh_cache-7d4fa130-c399-4e8c-a711-33a08ed5dde9" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 901.688256] 
env[59518]: DEBUG oslo_concurrency.lockutils [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Acquired lock "refresh_cache-7d4fa130-c399-4e8c-a711-33a08ed5dde9" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 901.689044] env[59518]: DEBUG nova.network.neutron [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] Building network info cache for instance {{(pid=59518) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 901.839515] env[59518]: DEBUG nova.network.neutron [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] Instance cache missing network info. {{(pid=59518) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 901.839723] env[59518]: DEBUG nova.network.neutron [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] Updating instance_info_cache with network_info: [] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 901.853585] env[59518]: DEBUG oslo_concurrency.lockutils [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Releasing lock "refresh_cache-7d4fa130-c399-4e8c-a711-33a08ed5dde9" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 901.853585] env[59518]: DEBUG nova.compute.manager [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] Instance network_info: |[]| {{(pid=59518) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 901.853585] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] Instance VIF info [] {{(pid=59518) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 901.859076] env[59518]: DEBUG oslo.service.loopingcall [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=59518) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 901.859613] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] Creating VM on the ESX host {{(pid=59518) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 901.859787] env[59518]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-6c550c86-6b03-4fe4-be5f-185950952a1d {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 901.881907] env[59518]: DEBUG nova.compute.manager [req-7feafed1-d675-4a43-964b-cdb6db573a30 req-8a203796-da42-4cf0-8681-2d7c9bb137c1 service nova] [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] Received event network-changed-b658b06d-96c6-40e3-92f2-3f42a66289c5 {{(pid=59518) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 901.882106] env[59518]: DEBUG nova.compute.manager [req-7feafed1-d675-4a43-964b-cdb6db573a30 req-8a203796-da42-4cf0-8681-2d7c9bb137c1 service nova] [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] Refreshing instance network info cache due to event network-changed-b658b06d-96c6-40e3-92f2-3f42a66289c5. {{(pid=59518) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 901.882328] env[59518]: DEBUG oslo_concurrency.lockutils [req-7feafed1-d675-4a43-964b-cdb6db573a30 req-8a203796-da42-4cf0-8681-2d7c9bb137c1 service nova] Acquiring lock "refresh_cache-7d4fa130-c399-4e8c-a711-33a08ed5dde9" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 901.882460] env[59518]: DEBUG oslo_concurrency.lockutils [req-7feafed1-d675-4a43-964b-cdb6db573a30 req-8a203796-da42-4cf0-8681-2d7c9bb137c1 service nova] Acquired lock "refresh_cache-7d4fa130-c399-4e8c-a711-33a08ed5dde9" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 901.882608] env[59518]: DEBUG nova.network.neutron [req-7feafed1-d675-4a43-964b-cdb6db573a30 req-8a203796-da42-4cf0-8681-2d7c9bb137c1 service nova] [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] Refreshing network info cache for port b658b06d-96c6-40e3-92f2-3f42a66289c5 {{(pid=59518) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 901.890678] env[59518]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 901.890678] env[59518]: value = "task-307983" [ 901.890678] env[59518]: _type = "Task" [ 901.890678] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 901.899846] env[59518]: DEBUG oslo_vmware.api [-] Task: {'id': task-307983, 'name': CreateVM_Task} progress is 0%. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 901.920054] env[59518]: DEBUG nova.network.neutron [req-7feafed1-d675-4a43-964b-cdb6db573a30 req-8a203796-da42-4cf0-8681-2d7c9bb137c1 service nova] [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] Instance cache missing network info. 
{{(pid=59518) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 902.036385] env[59518]: DEBUG nova.network.neutron [req-7feafed1-d675-4a43-964b-cdb6db573a30 req-8a203796-da42-4cf0-8681-2d7c9bb137c1 service nova] [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] Instance is deleted, no further info cache update {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:106}} [ 902.036553] env[59518]: DEBUG oslo_concurrency.lockutils [req-7feafed1-d675-4a43-964b-cdb6db573a30 req-8a203796-da42-4cf0-8681-2d7c9bb137c1 service nova] Releasing lock "refresh_cache-7d4fa130-c399-4e8c-a711-33a08ed5dde9" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 902.400917] env[59518]: DEBUG oslo_vmware.api [-] Task: {'id': task-307983, 'name': CreateVM_Task, 'duration_secs': 0.252252} completed successfully. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 902.401212] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] Created VM on the ESX host {{(pid=59518) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 902.401617] env[59518]: DEBUG oslo_concurrency.lockutils [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 902.401765] env[59518]: DEBUG oslo_concurrency.lockutils [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 902.402094] env[59518]: DEBUG oslo_concurrency.lockutils [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 902.402331] env[59518]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-eb4f9159-cdb0-4128-89d7-6430b0aa827e {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 902.414653] env[59518]: DEBUG oslo_vmware.api [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Waiting for the task: (returnval){ [ 902.414653] env[59518]: value = "session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]520bd469-a3ae-17d9-ceef-4ea490a313bc" [ 902.414653] env[59518]: _type = "Task" [ 902.414653] env[59518]: } to complete. 
{{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 902.422122] env[59518]: DEBUG oslo_concurrency.lockutils [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 902.422548] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] Processing image e70539a9-144d-4900-807e-914ae0cc8539 {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 902.422882] env[59518]: DEBUG oslo_concurrency.lockutils [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 917.128439] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 917.444670] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 919.448179] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 920.448270] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 920.448593] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Starting heal instance info cache {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9808}} [ 920.448593] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Rebuilding the list of instances to heal {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}} [ 920.465798] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: b4fb287e-7329-4002-911a-2d1eee138372] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 920.465945] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Skipping network cache update for instance because it is Building. 
{{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 920.466115] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: eead3c41-1a63-48f7-941e-24470658ed13] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 920.466185] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 920.466307] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: 282b61db-76cd-44c3-b500-7a465e903c97] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 920.466426] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 920.466539] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 920.466658] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 920.466777] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Didn't find any instances for network info cache update. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9894}} [ 920.467215] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 920.467469] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 920.467693] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 920.467922] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 920.468087] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=59518) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10427}} [ 921.463778] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 922.448936] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 922.457738] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 922.457839] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 922.457943] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 922.458096] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59518) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 922.459157] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c8f9f805-2880-4761-880a-dc4a947652b1 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 922.468155] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-809a52a2-43d2-43bd-af36-2e7653dd93e2 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 922.483019] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f66e641-19f6-47a7-b45e-4a5bfaac0c1a {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 922.488901] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d32d68fc-5e26-4244-8909-641f929fd221 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 922.517693] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181779MB free_disk=174GB free_vcpus=48 pci_devices=None {{(pid=59518) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 922.517693] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Acquiring lock 
"compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 922.517693] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 922.581476] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance b4fb287e-7329-4002-911a-2d1eee138372 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 922.581476] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance a85abea1-8e8d-4007-803d-e36fff55e587 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 922.581476] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance eead3c41-1a63-48f7-941e-24470658ed13 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 922.581476] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 04a58b0b-dfd8-4227-9c10-a69225fa5a53 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 922.581698] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 282b61db-76cd-44c3-b500-7a465e903c97 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 922.581698] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 3e59b5d7-978d-405a-b68a-47ee03b9a713 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 922.581698] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance af4b8dd9-a05d-427e-a147-76c7cfec5862 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 922.581698] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance ae88d565-bbf5-4c29-aee9-364c23086de5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 922.592254] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 935d4358-07b0-423b-8685-26d5bafe9e2f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 922.603963] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 3981aa30-0515-4764-9aac-d0c99a48b064 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 922.604221] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Total usable vcpus: 48, total allocated vcpus: 8 {{(pid=59518) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 922.604356] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1536MB phys_disk=200GB used_disk=8GB total_vcpus=48 used_vcpus=8 pci_stats=[] {{(pid=59518) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 922.719605] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1afb90b6-289a-4904-999d-2689414fd74f {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 922.726834] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9250ba23-3d86-4cb6-9493-c177435f1de8 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 922.757323] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6753ef57-cb49-4d1a-b117-b74a235b28e4 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 922.764448] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6bd723de-95fa-42df-ad0f-ad24476ba953 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 922.776969] env[59518]: DEBUG nova.compute.provider_tree [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Inventory has not changed in ProviderTree for provider: ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd {{(pid=59518) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 922.785533] env[59518]: DEBUG 
nova.scheduler.client.report [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Inventory has not changed for provider ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 174, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59518) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 922.797713] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59518) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 922.797908] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.280s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 947.597567] env[59518]: WARNING oslo_vmware.rw_handles [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 947.597567] env[59518]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 947.597567] env[59518]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 947.597567] env[59518]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 947.597567] env[59518]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 947.597567] env[59518]: ERROR oslo_vmware.rw_handles response.begin() [ 947.597567] env[59518]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 947.597567] env[59518]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 947.597567] env[59518]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 947.597567] env[59518]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 947.597567] env[59518]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 947.597567] env[59518]: ERROR oslo_vmware.rw_handles [ 947.598203] env[59518]: DEBUG nova.virt.vmwareapi.images [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: b4fb287e-7329-4002-911a-2d1eee138372] Downloaded image file data e70539a9-144d-4900-807e-914ae0cc8539 to vmware_temp/2ee164d9-b6d6-4d3c-8d4f-a5b35f57a4b8/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk on the data store datastore1 {{(pid=59518) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 947.600020] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: b4fb287e-7329-4002-911a-2d1eee138372] Caching 
image {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 947.600020] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Copying Virtual Disk [datastore1] vmware_temp/2ee164d9-b6d6-4d3c-8d4f-a5b35f57a4b8/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk to [datastore1] vmware_temp/2ee164d9-b6d6-4d3c-8d4f-a5b35f57a4b8/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk {{(pid=59518) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 947.600299] env[59518]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-a6a5e5f3-9af6-42aa-a115-69a25fe0395f {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 947.607508] env[59518]: DEBUG oslo_vmware.api [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Waiting for the task: (returnval){ [ 947.607508] env[59518]: value = "task-307984" [ 947.607508] env[59518]: _type = "Task" [ 947.607508] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 947.615134] env[59518]: DEBUG oslo_vmware.api [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Task: {'id': task-307984, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 948.117239] env[59518]: DEBUG oslo_vmware.exceptions [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Fault InvalidArgument not matched. 
{{(pid=59518) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 948.117479] env[59518]: DEBUG oslo_concurrency.lockutils [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 948.118025] env[59518]: ERROR nova.compute.manager [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: b4fb287e-7329-4002-911a-2d1eee138372] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 948.118025] env[59518]: Faults: ['InvalidArgument'] [ 948.118025] env[59518]: ERROR nova.compute.manager [instance: b4fb287e-7329-4002-911a-2d1eee138372] Traceback (most recent call last): [ 948.118025] env[59518]: ERROR nova.compute.manager [instance: b4fb287e-7329-4002-911a-2d1eee138372] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 948.118025] env[59518]: ERROR nova.compute.manager [instance: b4fb287e-7329-4002-911a-2d1eee138372] yield resources [ 948.118025] env[59518]: ERROR nova.compute.manager [instance: b4fb287e-7329-4002-911a-2d1eee138372] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 948.118025] env[59518]: ERROR nova.compute.manager [instance: b4fb287e-7329-4002-911a-2d1eee138372] self.driver.spawn(context, instance, image_meta, [ 948.118025] env[59518]: ERROR nova.compute.manager [instance: b4fb287e-7329-4002-911a-2d1eee138372] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 948.118025] env[59518]: ERROR nova.compute.manager [instance: b4fb287e-7329-4002-911a-2d1eee138372] self._vmops.spawn(context, instance, image_meta, injected_files, [ 948.118025] env[59518]: ERROR nova.compute.manager [instance: b4fb287e-7329-4002-911a-2d1eee138372] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 948.118025] env[59518]: ERROR nova.compute.manager [instance: b4fb287e-7329-4002-911a-2d1eee138372] self._fetch_image_if_missing(context, vi) [ 948.118025] env[59518]: ERROR nova.compute.manager [instance: b4fb287e-7329-4002-911a-2d1eee138372] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 948.118430] env[59518]: ERROR nova.compute.manager [instance: b4fb287e-7329-4002-911a-2d1eee138372] image_cache(vi, tmp_image_ds_loc) [ 948.118430] env[59518]: ERROR nova.compute.manager [instance: b4fb287e-7329-4002-911a-2d1eee138372] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 948.118430] env[59518]: ERROR nova.compute.manager [instance: b4fb287e-7329-4002-911a-2d1eee138372] vm_util.copy_virtual_disk( [ 948.118430] env[59518]: ERROR nova.compute.manager [instance: b4fb287e-7329-4002-911a-2d1eee138372] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 948.118430] env[59518]: ERROR nova.compute.manager [instance: b4fb287e-7329-4002-911a-2d1eee138372] session._wait_for_task(vmdk_copy_task) [ 948.118430] env[59518]: ERROR nova.compute.manager [instance: b4fb287e-7329-4002-911a-2d1eee138372] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 
948.118430] env[59518]: ERROR nova.compute.manager [instance: b4fb287e-7329-4002-911a-2d1eee138372] return self.wait_for_task(task_ref) [ 948.118430] env[59518]: ERROR nova.compute.manager [instance: b4fb287e-7329-4002-911a-2d1eee138372] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 948.118430] env[59518]: ERROR nova.compute.manager [instance: b4fb287e-7329-4002-911a-2d1eee138372] return evt.wait() [ 948.118430] env[59518]: ERROR nova.compute.manager [instance: b4fb287e-7329-4002-911a-2d1eee138372] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 948.118430] env[59518]: ERROR nova.compute.manager [instance: b4fb287e-7329-4002-911a-2d1eee138372] result = hub.switch() [ 948.118430] env[59518]: ERROR nova.compute.manager [instance: b4fb287e-7329-4002-911a-2d1eee138372] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 948.118430] env[59518]: ERROR nova.compute.manager [instance: b4fb287e-7329-4002-911a-2d1eee138372] return self.greenlet.switch() [ 948.118824] env[59518]: ERROR nova.compute.manager [instance: b4fb287e-7329-4002-911a-2d1eee138372] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 948.118824] env[59518]: ERROR nova.compute.manager [instance: b4fb287e-7329-4002-911a-2d1eee138372] self.f(*self.args, **self.kw) [ 948.118824] env[59518]: ERROR nova.compute.manager [instance: b4fb287e-7329-4002-911a-2d1eee138372] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 948.118824] env[59518]: ERROR nova.compute.manager [instance: b4fb287e-7329-4002-911a-2d1eee138372] raise exceptions.translate_fault(task_info.error) [ 948.118824] env[59518]: ERROR nova.compute.manager [instance: b4fb287e-7329-4002-911a-2d1eee138372] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 948.118824] env[59518]: ERROR nova.compute.manager [instance: b4fb287e-7329-4002-911a-2d1eee138372] Faults: ['InvalidArgument'] [ 948.118824] env[59518]: ERROR nova.compute.manager [instance: b4fb287e-7329-4002-911a-2d1eee138372] [ 948.118824] env[59518]: INFO nova.compute.manager [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: b4fb287e-7329-4002-911a-2d1eee138372] Terminating instance [ 948.119907] env[59518]: DEBUG oslo_concurrency.lockutils [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 948.120117] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 948.120345] env[59518]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e3bcdd76-6c5c-4cb4-89b4-5c8a08d691ed {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 948.122447] env[59518]: DEBUG nova.compute.manager 
[None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: b4fb287e-7329-4002-911a-2d1eee138372] Start destroying the instance on the hypervisor. {{(pid=59518) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 948.122622] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: b4fb287e-7329-4002-911a-2d1eee138372] Destroying instance {{(pid=59518) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 948.123300] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d75503f3-6050-48d5-89b5-bf6579c61a21 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 948.129511] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: b4fb287e-7329-4002-911a-2d1eee138372] Unregistering the VM {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 948.129703] env[59518]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-1ad70f1a-7e47-494d-94a3-a6512e945fdd {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 948.131670] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 948.131829] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=59518) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 948.132743] env[59518]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e230ce7c-ec78-459d-8a52-27904716eae3 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 948.137180] env[59518]: DEBUG oslo_vmware.api [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Waiting for the task: (returnval){ [ 948.137180] env[59518]: value = "session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]5297f1a9-991f-328b-16b3-c319e07840b2" [ 948.137180] env[59518]: _type = "Task" [ 948.137180] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 948.145462] env[59518]: DEBUG oslo_vmware.api [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Task: {'id': session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]5297f1a9-991f-328b-16b3-c319e07840b2, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 948.199643] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: b4fb287e-7329-4002-911a-2d1eee138372] Unregistered the VM {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 948.199872] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: b4fb287e-7329-4002-911a-2d1eee138372] Deleting contents of the VM from datastore datastore1 {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 948.200107] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Deleting the datastore file [datastore1] b4fb287e-7329-4002-911a-2d1eee138372 {{(pid=59518) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 948.200393] env[59518]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-91e5f9be-2093-4caa-b140-b3882433db44 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 948.206369] env[59518]: DEBUG oslo_vmware.api [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Waiting for the task: (returnval){ [ 948.206369] env[59518]: value = "task-307986" [ 948.206369] env[59518]: _type = "Task" [ 948.206369] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 948.213745] env[59518]: DEBUG oslo_vmware.api [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Task: {'id': task-307986, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 948.646616] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: eead3c41-1a63-48f7-941e-24470658ed13] Preparing fetch location {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 948.646892] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Creating directory with path [datastore1] vmware_temp/69b359d1-a6ce-4adb-9d46-7a345bb3b714/e70539a9-144d-4900-807e-914ae0cc8539 {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 948.647097] env[59518]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ec72f938-2cd7-40a7-8297-1a8b7a80a68a {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 948.658919] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Created directory with path [datastore1] vmware_temp/69b359d1-a6ce-4adb-9d46-7a345bb3b714/e70539a9-144d-4900-807e-914ae0cc8539 {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 948.659135] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: eead3c41-1a63-48f7-941e-24470658ed13] Fetch image to [datastore1] vmware_temp/69b359d1-a6ce-4adb-9d46-7a345bb3b714/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 948.659297] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: eead3c41-1a63-48f7-941e-24470658ed13] Downloading image file data e70539a9-144d-4900-807e-914ae0cc8539 to [datastore1] vmware_temp/69b359d1-a6ce-4adb-9d46-7a345bb3b714/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk on the data store datastore1 {{(pid=59518) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 948.660074] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0fbed7e6-1d19-4d01-852e-7eab8324b018 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 948.666922] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6d0a841-1c3b-4444-bb1d-2602228180e0 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 948.676955] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d2301d08-c5ba-4c97-8a0b-132c31e21e58 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 948.706520] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7637f3d0-47b9-47b0-b00b-03baa48ff340 {{(pid=59518) 
request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 948.717014] env[59518]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-49231a22-47e7-469f-bdf0-2afba4515183 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 948.718585] env[59518]: DEBUG oslo_vmware.api [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Task: {'id': task-307986, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.077117} completed successfully. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 948.718797] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Deleted the datastore file {{(pid=59518) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 948.718968] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: b4fb287e-7329-4002-911a-2d1eee138372] Deleted contents of the VM from datastore datastore1 {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 948.719129] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: b4fb287e-7329-4002-911a-2d1eee138372] Instance destroyed {{(pid=59518) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 948.719290] env[59518]: INFO nova.compute.manager [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: b4fb287e-7329-4002-911a-2d1eee138372] Took 0.60 seconds to destroy the instance on the hypervisor. 
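A note on the oslo.vmware records above: the "Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-..." lines are emitted by oslo.vmware's SOAP request handler, and the "Waiting for the task: (returnval){...} to complete" / "progress is 0%" / "completed successfully" lines by its task poller. Below is a minimal, illustrative sketch of that pattern, not nova's actual call path: the host, credentials, and datastore paths are placeholders, and nova itself goes through session._call_method() with datacenter refs and a copy spec, omitted here for brevity.

    # Illustrative sketch of driving a vCenter task with oslo.vmware.
    from oslo_vmware import api

    session = api.VMwareAPISession(
        'vcenter.example.test', 'svc-user', 'secret',   # placeholders
        api_retry_count=10, task_poll_interval=0.5)

    # invoke_api() issues the SOAP request; each call is logged by
    # request_handler as "Invoking VirtualDiskManager.CopyVirtualDisk_Task
    # with opID=oslo.vmware-..." and returns a task moref such as the
    # 'task-307984' seen in this log.
    disk_mgr = session.vim.service_content.virtualDiskManager
    task = session.invoke_api(
        session.vim, 'CopyVirtualDisk_Task', disk_mgr,
        sourceName='[datastore1] vmware_temp/tmp-sparse.vmdk',
        destName='[datastore1] devstack-image-cache_base/image.vmdk')

    # wait_for_task() polls TaskInfo on a loopingcall timer: a queued or
    # running task is logged as "progress is N%", a finished one as
    # "completed successfully", and an errored one is raised as a fault
    # translated into an exception (see the traceback further down).
    session.wait_for_task(task)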
[ 948.721398] env[59518]: DEBUG nova.compute.claims [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: b4fb287e-7329-4002-911a-2d1eee138372] Aborting claim: {{(pid=59518) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 948.721573] env[59518]: DEBUG oslo_concurrency.lockutils [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 948.721773] env[59518]: DEBUG oslo_concurrency.lockutils [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 948.803164] env[59518]: DEBUG nova.virt.vmwareapi.images [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: eead3c41-1a63-48f7-941e-24470658ed13] Downloading image file data e70539a9-144d-4900-807e-914ae0cc8539 to the data store datastore1 {{(pid=59518) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 948.850596] env[59518]: DEBUG oslo_vmware.rw_handles [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/69b359d1-a6ce-4adb-9d46-7a345bb3b714/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59518) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 948.904816] env[59518]: DEBUG oslo_vmware.rw_handles [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Completed reading data from the image iterator. {{(pid=59518) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 948.904989] env[59518]: DEBUG oslo_vmware.rw_handles [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/69b359d1-a6ce-4adb-9d46-7a345bb3b714/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=59518) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 948.924946] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1925901c-cef7-4875-aa1f-bf545b3f94b3 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 948.932313] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b108562-0e8b-4570-91e1-1477594c53e4 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 948.962580] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82488e9c-ee9b-454e-8948-c17fbc0c9d40 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 948.969112] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e39d84ef-3ff0-4be1-9459-34ff483070a2 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 948.982037] env[59518]: DEBUG nova.compute.provider_tree [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Inventory has not changed in ProviderTree for provider: ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd {{(pid=59518) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 948.989989] env[59518]: DEBUG nova.scheduler.client.report [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Inventory has not changed for provider ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 174, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59518) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 949.005397] env[59518]: DEBUG oslo_concurrency.lockutils [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.284s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 949.005944] env[59518]: ERROR nova.compute.manager [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: b4fb287e-7329-4002-911a-2d1eee138372] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 949.005944] env[59518]: Faults: ['InvalidArgument'] [ 949.005944] env[59518]: ERROR nova.compute.manager [instance: b4fb287e-7329-4002-911a-2d1eee138372] Traceback (most recent call last): [ 949.005944] env[59518]: ERROR nova.compute.manager [instance: b4fb287e-7329-4002-911a-2d1eee138372] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 949.005944] env[59518]: ERROR nova.compute.manager [instance: b4fb287e-7329-4002-911a-2d1eee138372] 
self.driver.spawn(context, instance, image_meta, [ 949.005944] env[59518]: ERROR nova.compute.manager [instance: b4fb287e-7329-4002-911a-2d1eee138372] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 949.005944] env[59518]: ERROR nova.compute.manager [instance: b4fb287e-7329-4002-911a-2d1eee138372] self._vmops.spawn(context, instance, image_meta, injected_files, [ 949.005944] env[59518]: ERROR nova.compute.manager [instance: b4fb287e-7329-4002-911a-2d1eee138372] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 949.005944] env[59518]: ERROR nova.compute.manager [instance: b4fb287e-7329-4002-911a-2d1eee138372] self._fetch_image_if_missing(context, vi) [ 949.005944] env[59518]: ERROR nova.compute.manager [instance: b4fb287e-7329-4002-911a-2d1eee138372] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 949.005944] env[59518]: ERROR nova.compute.manager [instance: b4fb287e-7329-4002-911a-2d1eee138372] image_cache(vi, tmp_image_ds_loc) [ 949.005944] env[59518]: ERROR nova.compute.manager [instance: b4fb287e-7329-4002-911a-2d1eee138372] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 949.006356] env[59518]: ERROR nova.compute.manager [instance: b4fb287e-7329-4002-911a-2d1eee138372] vm_util.copy_virtual_disk( [ 949.006356] env[59518]: ERROR nova.compute.manager [instance: b4fb287e-7329-4002-911a-2d1eee138372] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 949.006356] env[59518]: ERROR nova.compute.manager [instance: b4fb287e-7329-4002-911a-2d1eee138372] session._wait_for_task(vmdk_copy_task) [ 949.006356] env[59518]: ERROR nova.compute.manager [instance: b4fb287e-7329-4002-911a-2d1eee138372] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 949.006356] env[59518]: ERROR nova.compute.manager [instance: b4fb287e-7329-4002-911a-2d1eee138372] return self.wait_for_task(task_ref) [ 949.006356] env[59518]: ERROR nova.compute.manager [instance: b4fb287e-7329-4002-911a-2d1eee138372] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 949.006356] env[59518]: ERROR nova.compute.manager [instance: b4fb287e-7329-4002-911a-2d1eee138372] return evt.wait() [ 949.006356] env[59518]: ERROR nova.compute.manager [instance: b4fb287e-7329-4002-911a-2d1eee138372] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 949.006356] env[59518]: ERROR nova.compute.manager [instance: b4fb287e-7329-4002-911a-2d1eee138372] result = hub.switch() [ 949.006356] env[59518]: ERROR nova.compute.manager [instance: b4fb287e-7329-4002-911a-2d1eee138372] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 949.006356] env[59518]: ERROR nova.compute.manager [instance: b4fb287e-7329-4002-911a-2d1eee138372] return self.greenlet.switch() [ 949.006356] env[59518]: ERROR nova.compute.manager [instance: b4fb287e-7329-4002-911a-2d1eee138372] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 949.006356] env[59518]: ERROR nova.compute.manager [instance: b4fb287e-7329-4002-911a-2d1eee138372] self.f(*self.args, **self.kw) [ 949.006781] env[59518]: ERROR nova.compute.manager [instance: b4fb287e-7329-4002-911a-2d1eee138372] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 949.006781] env[59518]: ERROR nova.compute.manager [instance: 
b4fb287e-7329-4002-911a-2d1eee138372] raise exceptions.translate_fault(task_info.error) [ 949.006781] env[59518]: ERROR nova.compute.manager [instance: b4fb287e-7329-4002-911a-2d1eee138372] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 949.006781] env[59518]: ERROR nova.compute.manager [instance: b4fb287e-7329-4002-911a-2d1eee138372] Faults: ['InvalidArgument'] [ 949.006781] env[59518]: ERROR nova.compute.manager [instance: b4fb287e-7329-4002-911a-2d1eee138372] [ 949.006781] env[59518]: DEBUG nova.compute.utils [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: b4fb287e-7329-4002-911a-2d1eee138372] VimFaultException {{(pid=59518) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 949.007921] env[59518]: DEBUG nova.compute.manager [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: b4fb287e-7329-4002-911a-2d1eee138372] Build of instance b4fb287e-7329-4002-911a-2d1eee138372 was re-scheduled: A specified parameter was not correct: fileType [ 949.007921] env[59518]: Faults: ['InvalidArgument'] {{(pid=59518) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 949.008294] env[59518]: DEBUG nova.compute.manager [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: b4fb287e-7329-4002-911a-2d1eee138372] Unplugging VIFs for instance {{(pid=59518) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 949.008460] env[59518]: DEBUG nova.compute.manager [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59518) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 949.008657] env[59518]: DEBUG nova.compute.manager [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: b4fb287e-7329-4002-911a-2d1eee138372] Deallocating network for instance {{(pid=59518) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 949.008837] env[59518]: DEBUG nova.network.neutron [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: b4fb287e-7329-4002-911a-2d1eee138372] deallocate_for_instance() {{(pid=59518) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 949.251502] env[59518]: DEBUG nova.network.neutron [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: b4fb287e-7329-4002-911a-2d1eee138372] Updating instance_info_cache with network_info: [] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 949.261952] env[59518]: INFO nova.compute.manager [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: b4fb287e-7329-4002-911a-2d1eee138372] Took 0.25 seconds to deallocate network for instance. 
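The traceback above ends in oslo.vmware's _poll_task raising exceptions.translate_fault(task_info.error); the earlier "Fault InvalidArgument not matched" record means no specific exception class is registered for that fault name, so the generic VimFaultException is raised with the fault names attached. A hedged sketch of what a caller sees, assuming a session and a copy-task moref as in the previous sketch (wait_for_disk_copy is an illustrative name, not a nova function):

    from oslo_vmware import exceptions as vexc

    def wait_for_disk_copy(session, copy_task):
        # copy_task: the moref returned by a CopyVirtualDisk_Task call.
        try:
            session.wait_for_task(copy_task)
        except vexc.VimFaultException as e:
            # e.fault_list mirrors the "Faults: ['InvalidArgument']"
            # lines above; the exception text carries vCenter's message
            # ("A specified parameter was not correct: fileType").
            if 'InvalidArgument' in e.fault_list:
                pass  # nova aborts the resource claim and re-schedules
            raise

In this log the exception propagates out of spawn(), the claim is aborted under the "compute_resources" lock, and _do_build_and_run_instance re-schedules the build, which is the sequence the surrounding records show.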
[ 949.348205] env[59518]: INFO nova.scheduler.client.report [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Deleted allocations for instance b4fb287e-7329-4002-911a-2d1eee138372 [ 949.370711] env[59518]: DEBUG oslo_concurrency.lockutils [None req-d16a8c55-4811-4914-8b23-77cdaaa7af8b tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Lock "b4fb287e-7329-4002-911a-2d1eee138372" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 333.322s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 949.371928] env[59518]: DEBUG oslo_concurrency.lockutils [None req-0e50e78e-3606-4804-bbb0-7ab66264cb88 tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Lock "b4fb287e-7329-4002-911a-2d1eee138372" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 135.151s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 949.372281] env[59518]: DEBUG oslo_concurrency.lockutils [None req-0e50e78e-3606-4804-bbb0-7ab66264cb88 tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Acquiring lock "b4fb287e-7329-4002-911a-2d1eee138372-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 949.372525] env[59518]: DEBUG oslo_concurrency.lockutils [None req-0e50e78e-3606-4804-bbb0-7ab66264cb88 tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Lock "b4fb287e-7329-4002-911a-2d1eee138372-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 949.372690] env[59518]: DEBUG oslo_concurrency.lockutils [None req-0e50e78e-3606-4804-bbb0-7ab66264cb88 tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Lock "b4fb287e-7329-4002-911a-2d1eee138372-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 949.374705] env[59518]: INFO nova.compute.manager [None req-0e50e78e-3606-4804-bbb0-7ab66264cb88 tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: b4fb287e-7329-4002-911a-2d1eee138372] Terminating instance [ 949.376293] env[59518]: DEBUG nova.compute.manager [None req-0e50e78e-3606-4804-bbb0-7ab66264cb88 tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: b4fb287e-7329-4002-911a-2d1eee138372] Start destroying the instance on the hypervisor. 
{{(pid=59518) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 949.376473] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-0e50e78e-3606-4804-bbb0-7ab66264cb88 tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: b4fb287e-7329-4002-911a-2d1eee138372] Destroying instance {{(pid=59518) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 949.376894] env[59518]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-1ffb2ed2-e90b-4847-a554-4409a3fbd1c2 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 949.385326] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bdac03e3-bd83-4237-8371-7440d4fdf3b9 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 949.395659] env[59518]: DEBUG nova.compute.manager [None req-5ceb658d-bff5-410e-817f-25ac83db0abf tempest-ServerActionsV293TestJSON-855670052 tempest-ServerActionsV293TestJSON-855670052-project-member] [instance: 2277f51e-d169-416c-b86b-fb8a019a309d] Starting instance... {{(pid=59518) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 949.414740] env[59518]: WARNING nova.virt.vmwareapi.vmops [None req-0e50e78e-3606-4804-bbb0-7ab66264cb88 tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: b4fb287e-7329-4002-911a-2d1eee138372] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance b4fb287e-7329-4002-911a-2d1eee138372 could not be found. [ 949.414927] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-0e50e78e-3606-4804-bbb0-7ab66264cb88 tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: b4fb287e-7329-4002-911a-2d1eee138372] Instance destroyed {{(pid=59518) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 949.415090] env[59518]: INFO nova.compute.manager [None req-0e50e78e-3606-4804-bbb0-7ab66264cb88 tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: b4fb287e-7329-4002-911a-2d1eee138372] Took 0.04 seconds to destroy the instance on the hypervisor. [ 949.415314] env[59518]: DEBUG oslo.service.loopingcall [None req-0e50e78e-3606-4804-bbb0-7ab66264cb88 tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=59518) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 949.415505] env[59518]: DEBUG nova.compute.manager [-] [instance: b4fb287e-7329-4002-911a-2d1eee138372] Deallocating network for instance {{(pid=59518) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 949.415595] env[59518]: DEBUG nova.network.neutron [-] [instance: b4fb287e-7329-4002-911a-2d1eee138372] deallocate_for_instance() {{(pid=59518) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 949.417986] env[59518]: DEBUG nova.compute.manager [None req-5ceb658d-bff5-410e-817f-25ac83db0abf tempest-ServerActionsV293TestJSON-855670052 tempest-ServerActionsV293TestJSON-855670052-project-member] [instance: 2277f51e-d169-416c-b86b-fb8a019a309d] Instance disappeared before build. 
{{(pid=59518) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 949.436973] env[59518]: DEBUG oslo_concurrency.lockutils [None req-5ceb658d-bff5-410e-817f-25ac83db0abf tempest-ServerActionsV293TestJSON-855670052 tempest-ServerActionsV293TestJSON-855670052-project-member] Lock "2277f51e-d169-416c-b86b-fb8a019a309d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 233.398s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 949.438272] env[59518]: DEBUG nova.network.neutron [-] [instance: b4fb287e-7329-4002-911a-2d1eee138372] Updating instance_info_cache with network_info: [] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 949.444873] env[59518]: DEBUG nova.compute.manager [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] Starting instance... {{(pid=59518) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 949.447370] env[59518]: INFO nova.compute.manager [-] [instance: b4fb287e-7329-4002-911a-2d1eee138372] Took 0.03 seconds to deallocate network for instance. [ 949.488761] env[59518]: DEBUG oslo_concurrency.lockutils [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 949.488991] env[59518]: DEBUG oslo_concurrency.lockutils [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 949.490386] env[59518]: INFO nova.compute.claims [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 949.556244] env[59518]: DEBUG oslo_concurrency.lockutils [None req-0e50e78e-3606-4804-bbb0-7ab66264cb88 tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Lock "b4fb287e-7329-4002-911a-2d1eee138372" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.184s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 949.643115] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-995adfc6-895a-4ff0-b630-c2b0f6b243a3 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 949.651113] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e802810d-1805-4c81-bedc-a738a161fa08 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 949.681020] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-7209a3d1-3438-407f-b758-4c07ea611557 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 949.687965] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-07ad221c-30b2-4af8-ad39-08b03558601b {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 949.701157] env[59518]: DEBUG nova.compute.provider_tree [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Inventory has not changed in ProviderTree for provider: ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd {{(pid=59518) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 949.711039] env[59518]: DEBUG nova.scheduler.client.report [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Inventory has not changed for provider ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 174, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59518) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 949.726702] env[59518]: DEBUG oslo_concurrency.lockutils [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.238s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 949.727195] env[59518]: DEBUG nova.compute.manager [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] Start building networks asynchronously for instance. {{(pid=59518) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 949.778955] env[59518]: DEBUG nova.compute.utils [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Using /dev/sd instead of None {{(pid=59518) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 949.780229] env[59518]: DEBUG nova.compute.manager [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] Allocating IP information in the background. 
{{(pid=59518) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 949.780392] env[59518]: DEBUG nova.network.neutron [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] allocate_for_instance() {{(pid=59518) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 949.789329] env[59518]: DEBUG nova.compute.manager [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] Start building block device mappings for instance. {{(pid=59518) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 949.849698] env[59518]: DEBUG nova.compute.manager [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] Start spawning the instance on the hypervisor. {{(pid=59518) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 949.864329] env[59518]: DEBUG nova.policy [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5b893dfae76248ec98ab38c6abb6047c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd3476ab8778c40218c4b2b54e1297f19', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59518) authorize /opt/stack/nova/nova/policy.py:203}} [ 949.878312] env[59518]: DEBUG nova.virt.hardware [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-05-28T13:20:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9e74b609-2486-45af-b854-10c8e489c14d',id=38,is_public=True,memory_mb=128,name='tempest-test_resize_flavor_-1739269515',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-05-28T13:15:21Z,direct_url=,disk_format='vmdk',id=e70539a9-144d-4900-807e-914ae0cc8539,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='33bab75ff2cf45ecb4ab54af3adf83ee',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-05-28T13:15:22Z,virtual_size=,visibility=), allow threads: False {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 949.878549] env[59518]: DEBUG nova.virt.hardware [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Flavor limits 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 949.878700] env[59518]: DEBUG nova.virt.hardware [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Image limits 
0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 949.878877] env[59518]: DEBUG nova.virt.hardware [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Flavor pref 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 949.879030] env[59518]: DEBUG nova.virt.hardware [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Image pref 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 949.879170] env[59518]: DEBUG nova.virt.hardware [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 949.879362] env[59518]: DEBUG nova.virt.hardware [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 949.879509] env[59518]: DEBUG nova.virt.hardware [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59518) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 949.879662] env[59518]: DEBUG nova.virt.hardware [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Got 1 possible topologies {{(pid=59518) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 949.879815] env[59518]: DEBUG nova.virt.hardware [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 949.879978] env[59518]: DEBUG nova.virt.hardware [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 949.880823] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3a61d7de-f9f0-4ccf-9246-e157232445e3 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 949.888710] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-642c6296-0d23-4e81-a9c7-fde263c3dd5f {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 950.050740] env[59518]: DEBUG oslo_concurrency.lockutils [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Acquiring lock "39cfe606-43a0-4a52-8ec1-433baf7a3aec" by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 950.050963] env[59518]: DEBUG oslo_concurrency.lockutils [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Lock "39cfe606-43a0-4a52-8ec1-433baf7a3aec" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 950.124292] env[59518]: DEBUG nova.network.neutron [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] Successfully created port: 6b2feb20-91d6-4694-972e-e6de127559fb {{(pid=59518) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 950.557493] env[59518]: DEBUG nova.network.neutron [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] Successfully updated port: 6b2feb20-91d6-4694-972e-e6de127559fb {{(pid=59518) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 950.565701] env[59518]: DEBUG oslo_concurrency.lockutils [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Acquiring lock "refresh_cache-935d4358-07b0-423b-8685-26d5bafe9e2f" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 950.565837] env[59518]: DEBUG oslo_concurrency.lockutils [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Acquired lock "refresh_cache-935d4358-07b0-423b-8685-26d5bafe9e2f" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 950.565979] env[59518]: DEBUG nova.network.neutron [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] Building network info cache for instance {{(pid=59518) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 950.595923] env[59518]: DEBUG nova.network.neutron [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] Instance cache missing network info. 
{{(pid=59518) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 950.716318] env[59518]: DEBUG nova.network.neutron [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] Updating instance_info_cache with network_info: [{"id": "6b2feb20-91d6-4694-972e-e6de127559fb", "address": "fa:16:3e:0f:cd:b9", "network": {"id": "d9098ce0-cf4b-4e0f-b9b7-b613f8d5b1d5", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.108", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "33bab75ff2cf45ecb4ab54af3adf83ee", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "eaf3dfa2-fa01-4d4d-8ecd-a9bc74d90ec2", "external-id": "nsx-vlan-transportzone-546", "segmentation_id": 546, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6b2feb20-91", "ovs_interfaceid": "6b2feb20-91d6-4694-972e-e6de127559fb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 950.726507] env[59518]: DEBUG oslo_concurrency.lockutils [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Releasing lock "refresh_cache-935d4358-07b0-423b-8685-26d5bafe9e2f" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 950.728021] env[59518]: DEBUG nova.compute.manager [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] Instance network_info: |[{"id": "6b2feb20-91d6-4694-972e-e6de127559fb", "address": "fa:16:3e:0f:cd:b9", "network": {"id": "d9098ce0-cf4b-4e0f-b9b7-b613f8d5b1d5", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.108", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "33bab75ff2cf45ecb4ab54af3adf83ee", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "eaf3dfa2-fa01-4d4d-8ecd-a9bc74d90ec2", "external-id": "nsx-vlan-transportzone-546", "segmentation_id": 546, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6b2feb20-91", "ovs_interfaceid": "6b2feb20-91d6-4694-972e-e6de127559fb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59518) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 950.728152] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None 
req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:0f:cd:b9', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'eaf3dfa2-fa01-4d4d-8ecd-a9bc74d90ec2', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '6b2feb20-91d6-4694-972e-e6de127559fb', 'vif_model': 'vmxnet3'}] {{(pid=59518) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 950.734485] env[59518]: DEBUG oslo.service.loopingcall [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59518) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 950.734880] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] Creating VM on the ESX host {{(pid=59518) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 950.735088] env[59518]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-1b7c2e4e-77a0-4e98-8974-2b310ec21176 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 950.754647] env[59518]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 950.754647] env[59518]: value = "task-307987"
[ 950.754647] env[59518]: _type = "Task"
[ 950.754647] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 950.762048] env[59518]: DEBUG oslo_vmware.api [-] Task: {'id': task-307987, 'name': CreateVM_Task} progress is 0%. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 951.266523] env[59518]: DEBUG oslo_vmware.api [-] Task: {'id': task-307987, 'name': CreateVM_Task, 'duration_secs': 0.297373} completed successfully. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 951.267867] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] Created VM on the ESX host {{(pid=59518) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 951.267867] env[59518]: DEBUG oslo_concurrency.lockutils [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 951.267867] env[59518]: DEBUG oslo_concurrency.lockutils [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 951.267867] env[59518]: DEBUG oslo_concurrency.lockutils [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}}
[ 951.268070] env[59518]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c5762a30-f467-483f-b24f-e08fe1d20f04 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 951.272853] env[59518]: DEBUG oslo_vmware.api [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Waiting for the task: (returnval){
[ 951.272853] env[59518]: value = "session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]52a657fa-0502-81b4-56c1-ea4f5bcf1440"
[ 951.272853] env[59518]: _type = "Task"
[ 951.272853] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 951.280181] env[59518]: DEBUG oslo_vmware.api [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Task: {'id': session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]52a657fa-0502-81b4-56c1-ea4f5bcf1440, 'name': SearchDatastore_Task} progress is 0%.
{{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 951.385787] env[59518]: DEBUG nova.compute.manager [req-a4047304-9610-4e06-8444-71cbfd73e7d4 req-2ca64f13-90e2-4248-9697-df20eac52bed service nova] [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] Received event network-vif-plugged-6b2feb20-91d6-4694-972e-e6de127559fb {{(pid=59518) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 951.386000] env[59518]: DEBUG oslo_concurrency.lockutils [req-a4047304-9610-4e06-8444-71cbfd73e7d4 req-2ca64f13-90e2-4248-9697-df20eac52bed service nova] Acquiring lock "935d4358-07b0-423b-8685-26d5bafe9e2f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 951.386199] env[59518]: DEBUG oslo_concurrency.lockutils [req-a4047304-9610-4e06-8444-71cbfd73e7d4 req-2ca64f13-90e2-4248-9697-df20eac52bed service nova] Lock "935d4358-07b0-423b-8685-26d5bafe9e2f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 951.386354] env[59518]: DEBUG oslo_concurrency.lockutils [req-a4047304-9610-4e06-8444-71cbfd73e7d4 req-2ca64f13-90e2-4248-9697-df20eac52bed service nova] Lock "935d4358-07b0-423b-8685-26d5bafe9e2f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 951.386555] env[59518]: DEBUG nova.compute.manager [req-a4047304-9610-4e06-8444-71cbfd73e7d4 req-2ca64f13-90e2-4248-9697-df20eac52bed service nova] [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] No waiting events found dispatching network-vif-plugged-6b2feb20-91d6-4694-972e-e6de127559fb {{(pid=59518) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 951.386678] env[59518]: WARNING nova.compute.manager [req-a4047304-9610-4e06-8444-71cbfd73e7d4 req-2ca64f13-90e2-4248-9697-df20eac52bed service nova] [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] Received unexpected event network-vif-plugged-6b2feb20-91d6-4694-972e-e6de127559fb for instance with vm_state building and task_state spawning. [ 951.386790] env[59518]: DEBUG nova.compute.manager [req-a4047304-9610-4e06-8444-71cbfd73e7d4 req-2ca64f13-90e2-4248-9697-df20eac52bed service nova] [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] Received event network-changed-6b2feb20-91d6-4694-972e-e6de127559fb {{(pid=59518) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 951.386954] env[59518]: DEBUG nova.compute.manager [req-a4047304-9610-4e06-8444-71cbfd73e7d4 req-2ca64f13-90e2-4248-9697-df20eac52bed service nova] [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] Refreshing instance network info cache due to event network-changed-6b2feb20-91d6-4694-972e-e6de127559fb. 
{{(pid=59518) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 951.387092] env[59518]: DEBUG oslo_concurrency.lockutils [req-a4047304-9610-4e06-8444-71cbfd73e7d4 req-2ca64f13-90e2-4248-9697-df20eac52bed service nova] Acquiring lock "refresh_cache-935d4358-07b0-423b-8685-26d5bafe9e2f" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 951.387221] env[59518]: DEBUG oslo_concurrency.lockutils [req-a4047304-9610-4e06-8444-71cbfd73e7d4 req-2ca64f13-90e2-4248-9697-df20eac52bed service nova] Acquired lock "refresh_cache-935d4358-07b0-423b-8685-26d5bafe9e2f" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 951.387362] env[59518]: DEBUG nova.network.neutron [req-a4047304-9610-4e06-8444-71cbfd73e7d4 req-2ca64f13-90e2-4248-9697-df20eac52bed service nova] [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] Refreshing network info cache for port 6b2feb20-91d6-4694-972e-e6de127559fb {{(pid=59518) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 951.574509] env[59518]: DEBUG nova.network.neutron [req-a4047304-9610-4e06-8444-71cbfd73e7d4 req-2ca64f13-90e2-4248-9697-df20eac52bed service nova] [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] Updated VIF entry in instance network info cache for port 6b2feb20-91d6-4694-972e-e6de127559fb. {{(pid=59518) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 951.574824] env[59518]: DEBUG nova.network.neutron [req-a4047304-9610-4e06-8444-71cbfd73e7d4 req-2ca64f13-90e2-4248-9697-df20eac52bed service nova] [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] Updating instance_info_cache with network_info: [{"id": "6b2feb20-91d6-4694-972e-e6de127559fb", "address": "fa:16:3e:0f:cd:b9", "network": {"id": "d9098ce0-cf4b-4e0f-b9b7-b613f8d5b1d5", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.108", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "33bab75ff2cf45ecb4ab54af3adf83ee", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "eaf3dfa2-fa01-4d4d-8ecd-a9bc74d90ec2", "external-id": "nsx-vlan-transportzone-546", "segmentation_id": 546, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6b2feb20-91", "ovs_interfaceid": "6b2feb20-91d6-4694-972e-e6de127559fb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 951.584870] env[59518]: DEBUG oslo_concurrency.lockutils [req-a4047304-9610-4e06-8444-71cbfd73e7d4 req-2ca64f13-90e2-4248-9697-df20eac52bed service nova] Releasing lock "refresh_cache-935d4358-07b0-423b-8685-26d5bafe9e2f" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 951.783111] env[59518]: DEBUG oslo_concurrency.lockutils [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Releasing lock "[datastore1] 
devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 951.783435] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] Processing image e70539a9-144d-4900-807e-914ae0cc8539 {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 951.783685] env[59518]: DEBUG oslo_concurrency.lockutils [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 976.799033] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 980.448729] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 980.449149] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 980.449149] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=59518) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10427}} [ 981.448553] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 981.448726] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Starting heal instance info cache {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9808}} [ 981.448848] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Rebuilding the list of instances to heal {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}} [ 981.467381] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 981.467526] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: eead3c41-1a63-48f7-941e-24470658ed13] Skipping network cache update for instance because it is Building. 
{{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 981.467652] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 981.467772] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: 282b61db-76cd-44c3-b500-7a465e903c97] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 981.467890] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 981.468012] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 981.468138] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 981.468257] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 981.468374] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Didn't find any instances for network info cache update. 
{{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9894}} [ 982.448036] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 982.448274] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 982.448430] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 982.448570] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 984.448559] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 984.458073] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 984.458292] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 984.458441] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 984.458587] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59518) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 984.459656] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-41c7c2e9-3362-438c-a925-47f466fd612c {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 984.468151] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-83f2f100-c4d3-4ae9-9b00-39827291c2c6 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 984.481701] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e9f78ab2-2942-4d18-936e-a36034f67e42 {{(pid=59518) 
request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 984.487791] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f14e854-8867-47f7-ae73-aae9dc595c64 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 984.517516] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181777MB free_disk=174GB free_vcpus=48 pci_devices=None {{(pid=59518) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 984.517666] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 984.517862] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 984.575941] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance a85abea1-8e8d-4007-803d-e36fff55e587 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 984.576107] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance eead3c41-1a63-48f7-941e-24470658ed13 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 984.576225] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 04a58b0b-dfd8-4227-9c10-a69225fa5a53 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 984.576343] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 282b61db-76cd-44c3-b500-7a465e903c97 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 984.576455] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 3e59b5d7-978d-405a-b68a-47ee03b9a713 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 984.576594] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance af4b8dd9-a05d-427e-a147-76c7cfec5862 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 984.576730] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance ae88d565-bbf5-4c29-aee9-364c23086de5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 984.576841] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 935d4358-07b0-423b-8685-26d5bafe9e2f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 984.587238] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 3981aa30-0515-4764-9aac-d0c99a48b064 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 984.596930] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 39cfe606-43a0-4a52-8ec1-433baf7a3aec has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 984.597142] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Total usable vcpus: 48, total allocated vcpus: 8 {{(pid=59518) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 984.597288] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1536MB phys_disk=200GB used_disk=8GB total_vcpus=48 used_vcpus=8 pci_stats=[] {{(pid=59518) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 984.716376] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b9d49dc0-c0ee-4e6b-9076-05cd4da3c534 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 984.724361] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-91ec225c-44ff-434e-b97d-15aa59254f3f {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 984.753795] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78fccac3-ff7a-4d4b-843e-20a205a4bd6a {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 984.760348] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f676c3c0-b591-4c72-b448-0181be0db686 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 984.774320] env[59518]: DEBUG nova.compute.provider_tree [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Inventory has not changed in ProviderTree for provider: ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd {{(pid=59518) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 984.782924] env[59518]: DEBUG nova.scheduler.client.report [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Inventory has not changed for provider ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 174, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59518) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 984.795068] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59518) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 984.795232] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.277s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 996.831362] env[59518]: WARNING oslo_vmware.rw_handles [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f 
tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 996.831362] env[59518]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 996.831362] env[59518]: ERROR oslo_vmware.rw_handles   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 996.831362] env[59518]: ERROR oslo_vmware.rw_handles     self._conn.getresponse()
[ 996.831362] env[59518]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 996.831362] env[59518]: ERROR oslo_vmware.rw_handles     response.begin()
[ 996.831362] env[59518]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 996.831362] env[59518]: ERROR oslo_vmware.rw_handles     version, status, reason = self._read_status()
[ 996.831362] env[59518]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 996.831362] env[59518]: ERROR oslo_vmware.rw_handles     raise RemoteDisconnected("Remote end closed connection without"
[ 996.831362] env[59518]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 996.831362] env[59518]: ERROR oslo_vmware.rw_handles
[ 996.831967] env[59518]: DEBUG nova.virt.vmwareapi.images [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: eead3c41-1a63-48f7-941e-24470658ed13] Downloaded image file data e70539a9-144d-4900-807e-914ae0cc8539 to vmware_temp/69b359d1-a6ce-4adb-9d46-7a345bb3b714/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk on the data store datastore1 {{(pid=59518) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 996.833435] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: eead3c41-1a63-48f7-941e-24470658ed13] Caching image {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 996.833668] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Copying Virtual Disk [datastore1] vmware_temp/69b359d1-a6ce-4adb-9d46-7a345bb3b714/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk to [datastore1] vmware_temp/69b359d1-a6ce-4adb-9d46-7a345bb3b714/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk {{(pid=59518) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 996.833934] env[59518]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-cd06562a-6d25-4f38-9d81-968f7c5c4a28 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 996.842020] env[59518]: DEBUG oslo_vmware.api [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Waiting for the task: (returnval){
[ 996.842020] env[59518]: value = "task-307988"
[ 996.842020] env[59518]: _type = "Task"
[ 996.842020] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 996.849879] env[59518]: DEBUG oslo_vmware.api [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Task: {'id': task-307988, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 997.353104] env[59518]: DEBUG oslo_vmware.exceptions [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Fault InvalidArgument not matched. {{(pid=59518) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}}
[ 997.353364] env[59518]: DEBUG oslo_concurrency.lockutils [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 997.353950] env[59518]: ERROR nova.compute.manager [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: eead3c41-1a63-48f7-941e-24470658ed13] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 997.353950] env[59518]: Faults: ['InvalidArgument']
[ 997.353950] env[59518]: ERROR nova.compute.manager [instance: eead3c41-1a63-48f7-941e-24470658ed13] Traceback (most recent call last):
[ 997.353950] env[59518]: ERROR nova.compute.manager [instance: eead3c41-1a63-48f7-941e-24470658ed13]   File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 997.353950] env[59518]: ERROR nova.compute.manager [instance: eead3c41-1a63-48f7-941e-24470658ed13]     yield resources
[ 997.353950] env[59518]: ERROR nova.compute.manager [instance: eead3c41-1a63-48f7-941e-24470658ed13]   File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 997.353950] env[59518]: ERROR nova.compute.manager [instance: eead3c41-1a63-48f7-941e-24470658ed13]     self.driver.spawn(context, instance, image_meta,
[ 997.353950] env[59518]: ERROR nova.compute.manager [instance: eead3c41-1a63-48f7-941e-24470658ed13]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 997.353950] env[59518]: ERROR nova.compute.manager [instance: eead3c41-1a63-48f7-941e-24470658ed13]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 997.353950] env[59518]: ERROR nova.compute.manager [instance: eead3c41-1a63-48f7-941e-24470658ed13]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 997.353950] env[59518]: ERROR nova.compute.manager [instance: eead3c41-1a63-48f7-941e-24470658ed13]     self._fetch_image_if_missing(context, vi)
[ 997.353950] env[59518]: ERROR nova.compute.manager [instance: eead3c41-1a63-48f7-941e-24470658ed13]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 997.354228] env[59518]: ERROR nova.compute.manager [instance: eead3c41-1a63-48f7-941e-24470658ed13]     image_cache(vi, tmp_image_ds_loc)
[ 997.354228] env[59518]: ERROR nova.compute.manager [instance: eead3c41-1a63-48f7-941e-24470658ed13]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 997.354228] env[59518]: ERROR nova.compute.manager [instance: eead3c41-1a63-48f7-941e-24470658ed13]     vm_util.copy_virtual_disk(
[ 997.354228] env[59518]: ERROR nova.compute.manager [instance: eead3c41-1a63-48f7-941e-24470658ed13]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 997.354228] env[59518]: ERROR nova.compute.manager [instance: eead3c41-1a63-48f7-941e-24470658ed13]     session._wait_for_task(vmdk_copy_task)
[ 997.354228] env[59518]: ERROR nova.compute.manager [instance: eead3c41-1a63-48f7-941e-24470658ed13]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 997.354228] env[59518]: ERROR nova.compute.manager [instance: eead3c41-1a63-48f7-941e-24470658ed13]     return self.wait_for_task(task_ref)
[ 997.354228] env[59518]: ERROR nova.compute.manager [instance: eead3c41-1a63-48f7-941e-24470658ed13]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 997.354228] env[59518]: ERROR nova.compute.manager [instance: eead3c41-1a63-48f7-941e-24470658ed13]     return evt.wait()
[ 997.354228] env[59518]: ERROR nova.compute.manager [instance: eead3c41-1a63-48f7-941e-24470658ed13]   File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 997.354228] env[59518]: ERROR nova.compute.manager [instance: eead3c41-1a63-48f7-941e-24470658ed13]     result = hub.switch()
[ 997.354228] env[59518]: ERROR nova.compute.manager [instance: eead3c41-1a63-48f7-941e-24470658ed13]   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 997.354228] env[59518]: ERROR nova.compute.manager [instance: eead3c41-1a63-48f7-941e-24470658ed13]     return self.greenlet.switch()
[ 997.354498] env[59518]: ERROR nova.compute.manager [instance: eead3c41-1a63-48f7-941e-24470658ed13]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 997.354498] env[59518]: ERROR nova.compute.manager [instance: eead3c41-1a63-48f7-941e-24470658ed13]     self.f(*self.args, **self.kw)
[ 997.354498] env[59518]: ERROR nova.compute.manager [instance: eead3c41-1a63-48f7-941e-24470658ed13]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 997.354498] env[59518]: ERROR nova.compute.manager [instance: eead3c41-1a63-48f7-941e-24470658ed13]     raise exceptions.translate_fault(task_info.error)
[ 997.354498] env[59518]: ERROR nova.compute.manager [instance: eead3c41-1a63-48f7-941e-24470658ed13] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 997.354498] env[59518]: ERROR nova.compute.manager [instance: eead3c41-1a63-48f7-941e-24470658ed13] Faults: ['InvalidArgument']
[ 997.354498] env[59518]: ERROR nova.compute.manager [instance: eead3c41-1a63-48f7-941e-24470658ed13]
[ 997.354498] env[59518]: INFO nova.compute.manager [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: eead3c41-1a63-48f7-941e-24470658ed13] Terminating instance
[ 997.356426] env[59518]: DEBUG oslo_concurrency.lockutils [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk" {{(pid=59518)
lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 997.356633] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 997.357275] env[59518]: DEBUG nova.compute.manager [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: eead3c41-1a63-48f7-941e-24470658ed13] Start destroying the instance on the hypervisor. {{(pid=59518) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 997.357465] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: eead3c41-1a63-48f7-941e-24470658ed13] Destroying instance {{(pid=59518) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 997.357693] env[59518]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-77ab1fc8-7419-47e9-b14b-d7875f821f36 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 997.361383] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93646aab-0f77-4fa4-a5a8-280fee9e51b7 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 997.367862] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: eead3c41-1a63-48f7-941e-24470658ed13] Unregistering the VM {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 997.368066] env[59518]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-62beec1a-5912-4bd5-b675-0c3922a81cb4 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 997.370183] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 997.370346] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=59518) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 997.371258] env[59518]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4a9ace02-724b-4afe-9934-c15185a56fe0 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 997.376013] env[59518]: DEBUG oslo_vmware.api [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Waiting for the task: (returnval){ [ 997.376013] env[59518]: value = "session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]52f3ac85-a91e-812a-ba8f-f67be4df8f38" [ 997.376013] env[59518]: _type = "Task" [ 997.376013] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 997.382776] env[59518]: DEBUG oslo_vmware.api [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Task: {'id': session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]52f3ac85-a91e-812a-ba8f-f67be4df8f38, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 997.431134] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: eead3c41-1a63-48f7-941e-24470658ed13] Unregistered the VM {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 997.431359] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: eead3c41-1a63-48f7-941e-24470658ed13] Deleting contents of the VM from datastore datastore1 {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 997.431527] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Deleting the datastore file [datastore1] eead3c41-1a63-48f7-941e-24470658ed13 {{(pid=59518) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 997.431782] env[59518]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-13f17290-4286-40bf-bee0-be114a50bee3 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 997.437964] env[59518]: DEBUG oslo_vmware.api [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Waiting for the task: (returnval){ [ 997.437964] env[59518]: value = "task-307990" [ 997.437964] env[59518]: _type = "Task" [ 997.437964] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 997.445423] env[59518]: DEBUG oslo_vmware.api [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Task: {'id': task-307990, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 997.886373] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Preparing fetch location {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 997.886706] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Creating directory with path [datastore1] vmware_temp/8b5ae967-99a9-42f7-b805-57ce670a3591/e70539a9-144d-4900-807e-914ae0cc8539 {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 997.886834] env[59518]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4e34418f-dd70-466c-8d20-45166ccbd993 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 997.898301] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Created directory with path [datastore1] vmware_temp/8b5ae967-99a9-42f7-b805-57ce670a3591/e70539a9-144d-4900-807e-914ae0cc8539 {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 997.898481] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Fetch image to [datastore1] vmware_temp/8b5ae967-99a9-42f7-b805-57ce670a3591/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 997.898641] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Downloading image file data e70539a9-144d-4900-807e-914ae0cc8539 to [datastore1] vmware_temp/8b5ae967-99a9-42f7-b805-57ce670a3591/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk on the data store datastore1 {{(pid=59518) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 997.899385] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-611c6702-72e0-40e7-b173-43efb57c4b03 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 997.905902] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a2318c6-dfd6-4e90-9969-915266dddbee {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 997.914863] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b1e645b-87e2-44d2-8bda-bdfc6adf1bdd {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 997.947265] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-25666439-0e26-4163-bfad-f7e37f5dc971 {{(pid=59518) 
request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 997.953927] env[59518]: DEBUG oslo_vmware.api [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Task: {'id': task-307990, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074677} completed successfully. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 997.955400] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Deleted the datastore file {{(pid=59518) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 997.955614] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: eead3c41-1a63-48f7-941e-24470658ed13] Deleted contents of the VM from datastore datastore1 {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 997.955785] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: eead3c41-1a63-48f7-941e-24470658ed13] Instance destroyed {{(pid=59518) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 997.955949] env[59518]: INFO nova.compute.manager [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: eead3c41-1a63-48f7-941e-24470658ed13] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 997.957687] env[59518]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-1c781272-082d-4ca8-995b-e7db90b6308b {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 997.959508] env[59518]: DEBUG nova.compute.claims [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: eead3c41-1a63-48f7-941e-24470658ed13] Aborting claim: {{(pid=59518) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 997.959660] env[59518]: DEBUG oslo_concurrency.lockutils [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 997.959891] env[59518]: DEBUG oslo_concurrency.lockutils [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 997.982369] env[59518]: DEBUG nova.virt.vmwareapi.images [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Downloading image file data e70539a9-144d-4900-807e-914ae0cc8539 to the data store datastore1 {{(pid=59518) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 998.029222] env[59518]: DEBUG oslo_vmware.rw_handles [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/8b5ae967-99a9-42f7-b805-57ce670a3591/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59518) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 998.087682] env[59518]: DEBUG oslo_vmware.rw_handles [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Completed reading data from the image iterator. {{(pid=59518) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 998.087791] env[59518]: DEBUG oslo_vmware.rw_handles [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/8b5ae967-99a9-42f7-b805-57ce670a3591/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=59518) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 998.172012] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f3a5863-d13d-4264-8679-5ff700c73806 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 998.177633] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4af57265-4ec0-46b7-b65f-a46129ba3d8e {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 998.205591] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d33c66a2-1cfe-4ca6-adaf-461c28f13675 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 998.211983] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c926f59b-4d58-48b2-891f-71f99235d118 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 998.224296] env[59518]: DEBUG nova.compute.provider_tree [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Inventory has not changed in ProviderTree for provider: ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd {{(pid=59518) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 998.232880] env[59518]: DEBUG nova.scheduler.client.report [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Inventory has not changed for provider ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 174, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59518) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 998.245401] env[59518]: DEBUG oslo_concurrency.lockutils [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.285s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 998.245896] env[59518]: ERROR nova.compute.manager [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: eead3c41-1a63-48f7-941e-24470658ed13] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 998.245896] env[59518]: Faults: ['InvalidArgument'] [ 998.245896] env[59518]: ERROR nova.compute.manager [instance: eead3c41-1a63-48f7-941e-24470658ed13] Traceback (most recent call last): [ 998.245896] env[59518]: ERROR nova.compute.manager [instance: eead3c41-1a63-48f7-941e-24470658ed13] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 998.245896] env[59518]: ERROR nova.compute.manager [instance: 
eead3c41-1a63-48f7-941e-24470658ed13] self.driver.spawn(context, instance, image_meta, [ 998.245896] env[59518]: ERROR nova.compute.manager [instance: eead3c41-1a63-48f7-941e-24470658ed13] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 998.245896] env[59518]: ERROR nova.compute.manager [instance: eead3c41-1a63-48f7-941e-24470658ed13] self._vmops.spawn(context, instance, image_meta, injected_files, [ 998.245896] env[59518]: ERROR nova.compute.manager [instance: eead3c41-1a63-48f7-941e-24470658ed13] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 998.245896] env[59518]: ERROR nova.compute.manager [instance: eead3c41-1a63-48f7-941e-24470658ed13] self._fetch_image_if_missing(context, vi) [ 998.245896] env[59518]: ERROR nova.compute.manager [instance: eead3c41-1a63-48f7-941e-24470658ed13] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 998.245896] env[59518]: ERROR nova.compute.manager [instance: eead3c41-1a63-48f7-941e-24470658ed13] image_cache(vi, tmp_image_ds_loc) [ 998.245896] env[59518]: ERROR nova.compute.manager [instance: eead3c41-1a63-48f7-941e-24470658ed13] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 998.246200] env[59518]: ERROR nova.compute.manager [instance: eead3c41-1a63-48f7-941e-24470658ed13] vm_util.copy_virtual_disk( [ 998.246200] env[59518]: ERROR nova.compute.manager [instance: eead3c41-1a63-48f7-941e-24470658ed13] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 998.246200] env[59518]: ERROR nova.compute.manager [instance: eead3c41-1a63-48f7-941e-24470658ed13] session._wait_for_task(vmdk_copy_task) [ 998.246200] env[59518]: ERROR nova.compute.manager [instance: eead3c41-1a63-48f7-941e-24470658ed13] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 998.246200] env[59518]: ERROR nova.compute.manager [instance: eead3c41-1a63-48f7-941e-24470658ed13] return self.wait_for_task(task_ref) [ 998.246200] env[59518]: ERROR nova.compute.manager [instance: eead3c41-1a63-48f7-941e-24470658ed13] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 998.246200] env[59518]: ERROR nova.compute.manager [instance: eead3c41-1a63-48f7-941e-24470658ed13] return evt.wait() [ 998.246200] env[59518]: ERROR nova.compute.manager [instance: eead3c41-1a63-48f7-941e-24470658ed13] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 998.246200] env[59518]: ERROR nova.compute.manager [instance: eead3c41-1a63-48f7-941e-24470658ed13] result = hub.switch() [ 998.246200] env[59518]: ERROR nova.compute.manager [instance: eead3c41-1a63-48f7-941e-24470658ed13] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 998.246200] env[59518]: ERROR nova.compute.manager [instance: eead3c41-1a63-48f7-941e-24470658ed13] return self.greenlet.switch() [ 998.246200] env[59518]: ERROR nova.compute.manager [instance: eead3c41-1a63-48f7-941e-24470658ed13] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 998.246200] env[59518]: ERROR nova.compute.manager [instance: eead3c41-1a63-48f7-941e-24470658ed13] self.f(*self.args, **self.kw) [ 998.246524] env[59518]: ERROR nova.compute.manager [instance: eead3c41-1a63-48f7-941e-24470658ed13] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 998.246524] env[59518]: ERROR 
nova.compute.manager [instance: eead3c41-1a63-48f7-941e-24470658ed13] raise exceptions.translate_fault(task_info.error) [ 998.246524] env[59518]: ERROR nova.compute.manager [instance: eead3c41-1a63-48f7-941e-24470658ed13] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 998.246524] env[59518]: ERROR nova.compute.manager [instance: eead3c41-1a63-48f7-941e-24470658ed13] Faults: ['InvalidArgument'] [ 998.246524] env[59518]: ERROR nova.compute.manager [instance: eead3c41-1a63-48f7-941e-24470658ed13] [ 998.246653] env[59518]: DEBUG nova.compute.utils [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: eead3c41-1a63-48f7-941e-24470658ed13] VimFaultException {{(pid=59518) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 998.247822] env[59518]: DEBUG nova.compute.manager [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: eead3c41-1a63-48f7-941e-24470658ed13] Build of instance eead3c41-1a63-48f7-941e-24470658ed13 was re-scheduled: A specified parameter was not correct: fileType [ 998.247822] env[59518]: Faults: ['InvalidArgument'] {{(pid=59518) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 998.248203] env[59518]: DEBUG nova.compute.manager [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: eead3c41-1a63-48f7-941e-24470658ed13] Unplugging VIFs for instance {{(pid=59518) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 998.248368] env[59518]: DEBUG nova.compute.manager [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. 
{{(pid=59518) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 998.248527] env[59518]: DEBUG nova.compute.manager [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: eead3c41-1a63-48f7-941e-24470658ed13] Deallocating network for instance {{(pid=59518) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 998.248680] env[59518]: DEBUG nova.network.neutron [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: eead3c41-1a63-48f7-941e-24470658ed13] deallocate_for_instance() {{(pid=59518) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 998.452797] env[59518]: DEBUG nova.network.neutron [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: eead3c41-1a63-48f7-941e-24470658ed13] Updating instance_info_cache with network_info: [] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 998.462239] env[59518]: INFO nova.compute.manager [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: eead3c41-1a63-48f7-941e-24470658ed13] Took 0.21 seconds to deallocate network for instance. [ 998.545721] env[59518]: INFO nova.scheduler.client.report [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Deleted allocations for instance eead3c41-1a63-48f7-941e-24470658ed13 [ 998.562736] env[59518]: DEBUG oslo_concurrency.lockutils [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Lock "eead3c41-1a63-48f7-941e-24470658ed13" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 380.593s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 998.563767] env[59518]: DEBUG oslo_concurrency.lockutils [None req-a0a4b1a6-503e-40bd-8dcf-7d9e29694b2e tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Lock "eead3c41-1a63-48f7-941e-24470658ed13" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 178.390s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 998.563976] env[59518]: DEBUG oslo_concurrency.lockutils [None req-a0a4b1a6-503e-40bd-8dcf-7d9e29694b2e tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Acquiring lock "eead3c41-1a63-48f7-941e-24470658ed13-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 998.564196] env[59518]: DEBUG oslo_concurrency.lockutils [None req-a0a4b1a6-503e-40bd-8dcf-7d9e29694b2e tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Lock "eead3c41-1a63-48f7-941e-24470658ed13-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 998.564352] env[59518]: DEBUG oslo_concurrency.lockutils [None req-a0a4b1a6-503e-40bd-8dcf-7d9e29694b2e tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Lock "eead3c41-1a63-48f7-941e-24470658ed13-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 998.566591] env[59518]: INFO nova.compute.manager [None req-a0a4b1a6-503e-40bd-8dcf-7d9e29694b2e tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: eead3c41-1a63-48f7-941e-24470658ed13] Terminating instance [ 998.568432] env[59518]: DEBUG nova.compute.manager [None req-a0a4b1a6-503e-40bd-8dcf-7d9e29694b2e tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: eead3c41-1a63-48f7-941e-24470658ed13] Start destroying the instance on the hypervisor. {{(pid=59518) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 998.568624] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-a0a4b1a6-503e-40bd-8dcf-7d9e29694b2e tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: eead3c41-1a63-48f7-941e-24470658ed13] Destroying instance {{(pid=59518) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 998.569085] env[59518]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-26bc940c-888c-404f-931e-4da33be3ff68 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 998.580214] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-351e6775-459b-488a-8096-a46c99521dab {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 998.591305] env[59518]: DEBUG nova.compute.manager [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] [instance: 3981aa30-0515-4764-9aac-d0c99a48b064] Starting instance... {{(pid=59518) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 998.610296] env[59518]: WARNING nova.virt.vmwareapi.vmops [None req-a0a4b1a6-503e-40bd-8dcf-7d9e29694b2e tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: eead3c41-1a63-48f7-941e-24470658ed13] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance eead3c41-1a63-48f7-941e-24470658ed13 could not be found. 
[ 998.610507] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-a0a4b1a6-503e-40bd-8dcf-7d9e29694b2e tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: eead3c41-1a63-48f7-941e-24470658ed13] Instance destroyed {{(pid=59518) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 998.610672] env[59518]: INFO nova.compute.manager [None req-a0a4b1a6-503e-40bd-8dcf-7d9e29694b2e tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: eead3c41-1a63-48f7-941e-24470658ed13] Took 0.04 seconds to destroy the instance on the hypervisor. [ 998.610893] env[59518]: DEBUG oslo.service.loopingcall [None req-a0a4b1a6-503e-40bd-8dcf-7d9e29694b2e tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=59518) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 998.611145] env[59518]: DEBUG nova.compute.manager [-] [instance: eead3c41-1a63-48f7-941e-24470658ed13] Deallocating network for instance {{(pid=59518) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 998.611247] env[59518]: DEBUG nova.network.neutron [-] [instance: eead3c41-1a63-48f7-941e-24470658ed13] deallocate_for_instance() {{(pid=59518) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 998.633756] env[59518]: DEBUG nova.network.neutron [-] [instance: eead3c41-1a63-48f7-941e-24470658ed13] Updating instance_info_cache with network_info: [] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 998.635739] env[59518]: DEBUG oslo_concurrency.lockutils [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 998.635955] env[59518]: DEBUG oslo_concurrency.lockutils [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 998.637299] env[59518]: INFO nova.compute.claims [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] [instance: 3981aa30-0515-4764-9aac-d0c99a48b064] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 998.640366] env[59518]: INFO nova.compute.manager [-] [instance: eead3c41-1a63-48f7-941e-24470658ed13] Took 0.03 seconds to deallocate network for instance. 
[ 998.721542] env[59518]: DEBUG oslo_concurrency.lockutils [None req-a0a4b1a6-503e-40bd-8dcf-7d9e29694b2e tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Lock "eead3c41-1a63-48f7-941e-24470658ed13" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.158s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 998.820395] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-df94e2ae-54db-4dec-ae5b-c23c624ab977 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 998.829261] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b99be69f-08d6-4f3c-934c-865b9ea758e7 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 998.859790] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1bc61349-5f20-4f5a-9550-267283095b24 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 998.867521] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc130dec-128b-4bfc-bc3d-4b12425d4ef8 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 998.881818] env[59518]: DEBUG nova.compute.provider_tree [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] Inventory has not changed in ProviderTree for provider: ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd {{(pid=59518) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 998.890415] env[59518]: DEBUG nova.scheduler.client.report [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] Inventory has not changed for provider ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 174, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59518) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 998.903728] env[59518]: DEBUG oslo_concurrency.lockutils [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.267s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 998.903728] env[59518]: DEBUG nova.compute.manager [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] [instance: 3981aa30-0515-4764-9aac-d0c99a48b064] Start building networks asynchronously for instance. 
{{(pid=59518) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 998.934315] env[59518]: DEBUG nova.compute.utils [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] Using /dev/sd instead of None {{(pid=59518) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 998.935741] env[59518]: DEBUG nova.compute.manager [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] [instance: 3981aa30-0515-4764-9aac-d0c99a48b064] Allocating IP information in the background. {{(pid=59518) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 998.935902] env[59518]: DEBUG nova.network.neutron [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] [instance: 3981aa30-0515-4764-9aac-d0c99a48b064] allocate_for_instance() {{(pid=59518) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 998.943956] env[59518]: DEBUG nova.compute.manager [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] [instance: 3981aa30-0515-4764-9aac-d0c99a48b064] Start building block device mappings for instance. {{(pid=59518) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 998.978712] env[59518]: DEBUG nova.policy [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1df4b68c38084a3eb22fe1e1022cda14', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ce55d7c9be8148799a46914280718eb7', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59518) authorize /opt/stack/nova/nova/policy.py:203}} [ 999.002011] env[59518]: DEBUG nova.compute.manager [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] [instance: 3981aa30-0515-4764-9aac-d0c99a48b064] Start spawning the instance on the hypervisor. 
{{(pid=59518) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 999.022080] env[59518]: DEBUG nova.virt.hardware [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-05-28T13:15:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-05-28T13:15:21Z,direct_url=<?>,disk_format='vmdk',id=e70539a9-144d-4900-807e-914ae0cc8539,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='33bab75ff2cf45ecb4ab54af3adf83ee',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2024-05-28T13:15:22Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 999.022327] env[59518]: DEBUG nova.virt.hardware [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] Flavor limits 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 999.022478] env[59518]: DEBUG nova.virt.hardware [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] Image limits 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 999.022651] env[59518]: DEBUG nova.virt.hardware [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] Flavor pref 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 999.022789] env[59518]: DEBUG nova.virt.hardware [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] Image pref 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 999.022927] env[59518]: DEBUG nova.virt.hardware [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 999.023125] env[59518]: DEBUG nova.virt.hardware [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 999.023313] env[59518]: DEBUG nova.virt.hardware [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59518) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 999.023476] env[59518]: DEBUG nova.virt.hardware [None 
req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] Got 1 possible topologies {{(pid=59518) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 999.023630] env[59518]: DEBUG nova.virt.hardware [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 999.023791] env[59518]: DEBUG nova.virt.hardware [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 999.024676] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4444b82a-90bc-4b21-a3ed-50b95d7c9e3f {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 999.031969] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ce4a87b-52c9-4896-8a45-f33bdb6137fc {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 999.207646] env[59518]: DEBUG nova.network.neutron [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] [instance: 3981aa30-0515-4764-9aac-d0c99a48b064] Successfully created port: 9666ae20-185a-4053-8929-f337e34f5252 {{(pid=59518) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 999.696171] env[59518]: DEBUG nova.network.neutron [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] [instance: 3981aa30-0515-4764-9aac-d0c99a48b064] Successfully updated port: 9666ae20-185a-4053-8929-f337e34f5252 {{(pid=59518) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 999.711405] env[59518]: DEBUG oslo_concurrency.lockutils [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] Acquiring lock "refresh_cache-3981aa30-0515-4764-9aac-d0c99a48b064" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 999.711537] env[59518]: DEBUG oslo_concurrency.lockutils [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] Acquired lock "refresh_cache-3981aa30-0515-4764-9aac-d0c99a48b064" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 999.711677] env[59518]: DEBUG nova.network.neutron [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] [instance: 3981aa30-0515-4764-9aac-d0c99a48b064] Building network info cache for instance {{(pid=59518) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 999.744281] env[59518]: DEBUG nova.network.neutron [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 
tempest-ServerMetadataTestJSON-1358222584-project-member] [instance: 3981aa30-0515-4764-9aac-d0c99a48b064] Instance cache missing network info. {{(pid=59518) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 999.893622] env[59518]: DEBUG nova.network.neutron [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] [instance: 3981aa30-0515-4764-9aac-d0c99a48b064] Updating instance_info_cache with network_info: [{"id": "9666ae20-185a-4053-8929-f337e34f5252", "address": "fa:16:3e:19:b8:af", "network": {"id": "f5b8b495-04ce-496f-83e7-b801df7d0c7c", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-717339231-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ce55d7c9be8148799a46914280718eb7", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "69076131-87ac-46dd-9d5d-8d1b4ea7dec6", "external-id": "nsx-vlan-transportzone-327", "segmentation_id": 327, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9666ae20-18", "ovs_interfaceid": "9666ae20-185a-4053-8929-f337e34f5252", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 999.904362] env[59518]: DEBUG oslo_concurrency.lockutils [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] Releasing lock "refresh_cache-3981aa30-0515-4764-9aac-d0c99a48b064" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 999.904631] env[59518]: DEBUG nova.compute.manager [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] [instance: 3981aa30-0515-4764-9aac-d0c99a48b064] Instance network_info: |[{"id": "9666ae20-185a-4053-8929-f337e34f5252", "address": "fa:16:3e:19:b8:af", "network": {"id": "f5b8b495-04ce-496f-83e7-b801df7d0c7c", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-717339231-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ce55d7c9be8148799a46914280718eb7", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "69076131-87ac-46dd-9d5d-8d1b4ea7dec6", "external-id": "nsx-vlan-transportzone-327", "segmentation_id": 327, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9666ae20-18", "ovs_interfaceid": "9666ae20-185a-4053-8929-f337e34f5252", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", 
"profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59518) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 999.904984] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] [instance: 3981aa30-0515-4764-9aac-d0c99a48b064] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:19:b8:af', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '69076131-87ac-46dd-9d5d-8d1b4ea7dec6', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '9666ae20-185a-4053-8929-f337e34f5252', 'vif_model': 'vmxnet3'}] {{(pid=59518) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 999.912705] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] Creating folder: Project (ce55d7c9be8148799a46914280718eb7). Parent ref: group-v88807. {{(pid=59518) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 999.913216] env[59518]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c30244b8-f69d-4550-aaf9-973c48f8c914 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 999.925182] env[59518]: INFO nova.virt.vmwareapi.vm_util [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] Created folder: Project (ce55d7c9be8148799a46914280718eb7) in parent group-v88807. [ 999.925358] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] Creating folder: Instances. Parent ref: group-v88859. {{(pid=59518) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 999.925575] env[59518]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-47fc1134-f445-472e-93ef-afc97adfec72 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 999.934922] env[59518]: INFO nova.virt.vmwareapi.vm_util [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] Created folder: Instances in parent group-v88859. [ 999.935126] env[59518]: DEBUG oslo.service.loopingcall [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=59518) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 999.935295] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3981aa30-0515-4764-9aac-d0c99a48b064] Creating VM on the ESX host {{(pid=59518) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 999.935959] env[59518]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-0be7ded6-9705-4100-8c20-0f223222e059 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 999.960312] env[59518]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 999.960312] env[59518]: value = "task-307993" [ 999.960312] env[59518]: _type = "Task" [ 999.960312] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 999.968068] env[59518]: DEBUG oslo_vmware.api [-] Task: {'id': task-307993, 'name': CreateVM_Task} progress is 0%. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1000.469637] env[59518]: DEBUG oslo_vmware.api [-] Task: {'id': task-307993, 'name': CreateVM_Task, 'duration_secs': 0.282625} completed successfully. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1000.469802] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3981aa30-0515-4764-9aac-d0c99a48b064] Created VM on the ESX host {{(pid=59518) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1000.470491] env[59518]: DEBUG oslo_concurrency.lockutils [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1000.470654] env[59518]: DEBUG oslo_concurrency.lockutils [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1000.470993] env[59518]: DEBUG oslo_concurrency.lockutils [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1000.471256] env[59518]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-aa690917-36c1-4051-a8fe-0ae02106cf3d {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1000.475429] env[59518]: DEBUG oslo_vmware.api [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] Waiting for the task: (returnval){ [ 1000.475429] env[59518]: value = "session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]52ebd7b6-ff92-a212-32d1-29e0c6ffaf85" [ 1000.475429] env[59518]: _type = "Task" [ 1000.475429] env[59518]: } to complete. 
{{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1000.483765] env[59518]: DEBUG oslo_vmware.api [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] Task: {'id': session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]52ebd7b6-ff92-a212-32d1-29e0c6ffaf85, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1000.489542] env[59518]: DEBUG nova.compute.manager [req-1eefb317-3684-4cce-bc04-705ba5005db8 req-a9d33674-0eed-44fd-bb51-d3d16b5b296f service nova] [instance: 3981aa30-0515-4764-9aac-d0c99a48b064] Received event network-vif-plugged-9666ae20-185a-4053-8929-f337e34f5252 {{(pid=59518) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 1000.489730] env[59518]: DEBUG oslo_concurrency.lockutils [req-1eefb317-3684-4cce-bc04-705ba5005db8 req-a9d33674-0eed-44fd-bb51-d3d16b5b296f service nova] Acquiring lock "3981aa30-0515-4764-9aac-d0c99a48b064-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1000.489971] env[59518]: DEBUG oslo_concurrency.lockutils [req-1eefb317-3684-4cce-bc04-705ba5005db8 req-a9d33674-0eed-44fd-bb51-d3d16b5b296f service nova] Lock "3981aa30-0515-4764-9aac-d0c99a48b064-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1000.490072] env[59518]: DEBUG oslo_concurrency.lockutils [req-1eefb317-3684-4cce-bc04-705ba5005db8 req-a9d33674-0eed-44fd-bb51-d3d16b5b296f service nova] Lock "3981aa30-0515-4764-9aac-d0c99a48b064-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1000.490212] env[59518]: DEBUG nova.compute.manager [req-1eefb317-3684-4cce-bc04-705ba5005db8 req-a9d33674-0eed-44fd-bb51-d3d16b5b296f service nova] [instance: 3981aa30-0515-4764-9aac-d0c99a48b064] No waiting events found dispatching network-vif-plugged-9666ae20-185a-4053-8929-f337e34f5252 {{(pid=59518) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1000.490493] env[59518]: WARNING nova.compute.manager [req-1eefb317-3684-4cce-bc04-705ba5005db8 req-a9d33674-0eed-44fd-bb51-d3d16b5b296f service nova] [instance: 3981aa30-0515-4764-9aac-d0c99a48b064] Received unexpected event network-vif-plugged-9666ae20-185a-4053-8929-f337e34f5252 for instance with vm_state building and task_state spawning. [ 1000.490549] env[59518]: DEBUG nova.compute.manager [req-1eefb317-3684-4cce-bc04-705ba5005db8 req-a9d33674-0eed-44fd-bb51-d3d16b5b296f service nova] [instance: 3981aa30-0515-4764-9aac-d0c99a48b064] Received event network-changed-9666ae20-185a-4053-8929-f337e34f5252 {{(pid=59518) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 1000.490645] env[59518]: DEBUG nova.compute.manager [req-1eefb317-3684-4cce-bc04-705ba5005db8 req-a9d33674-0eed-44fd-bb51-d3d16b5b296f service nova] [instance: 3981aa30-0515-4764-9aac-d0c99a48b064] Refreshing instance network info cache due to event network-changed-9666ae20-185a-4053-8929-f337e34f5252. 
{{(pid=59518) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 1000.490816] env[59518]: DEBUG oslo_concurrency.lockutils [req-1eefb317-3684-4cce-bc04-705ba5005db8 req-a9d33674-0eed-44fd-bb51-d3d16b5b296f service nova] Acquiring lock "refresh_cache-3981aa30-0515-4764-9aac-d0c99a48b064" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1000.490941] env[59518]: DEBUG oslo_concurrency.lockutils [req-1eefb317-3684-4cce-bc04-705ba5005db8 req-a9d33674-0eed-44fd-bb51-d3d16b5b296f service nova] Acquired lock "refresh_cache-3981aa30-0515-4764-9aac-d0c99a48b064" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1000.491111] env[59518]: DEBUG nova.network.neutron [req-1eefb317-3684-4cce-bc04-705ba5005db8 req-a9d33674-0eed-44fd-bb51-d3d16b5b296f service nova] [instance: 3981aa30-0515-4764-9aac-d0c99a48b064] Refreshing network info cache for port 9666ae20-185a-4053-8929-f337e34f5252 {{(pid=59518) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1000.681459] env[59518]: DEBUG nova.network.neutron [req-1eefb317-3684-4cce-bc04-705ba5005db8 req-a9d33674-0eed-44fd-bb51-d3d16b5b296f service nova] [instance: 3981aa30-0515-4764-9aac-d0c99a48b064] Updated VIF entry in instance network info cache for port 9666ae20-185a-4053-8929-f337e34f5252. {{(pid=59518) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1000.681823] env[59518]: DEBUG nova.network.neutron [req-1eefb317-3684-4cce-bc04-705ba5005db8 req-a9d33674-0eed-44fd-bb51-d3d16b5b296f service nova] [instance: 3981aa30-0515-4764-9aac-d0c99a48b064] Updating instance_info_cache with network_info: [{"id": "9666ae20-185a-4053-8929-f337e34f5252", "address": "fa:16:3e:19:b8:af", "network": {"id": "f5b8b495-04ce-496f-83e7-b801df7d0c7c", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-717339231-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ce55d7c9be8148799a46914280718eb7", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "69076131-87ac-46dd-9d5d-8d1b4ea7dec6", "external-id": "nsx-vlan-transportzone-327", "segmentation_id": 327, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9666ae20-18", "ovs_interfaceid": "9666ae20-185a-4053-8929-f337e34f5252", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1000.691207] env[59518]: DEBUG oslo_concurrency.lockutils [req-1eefb317-3684-4cce-bc04-705ba5005db8 req-a9d33674-0eed-44fd-bb51-d3d16b5b296f service nova] Releasing lock "refresh_cache-3981aa30-0515-4764-9aac-d0c99a48b064" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1000.985341] env[59518]: DEBUG oslo_concurrency.lockutils [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] Releasing lock "[datastore1] 
devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1000.985612] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] [instance: 3981aa30-0515-4764-9aac-d0c99a48b064] Processing image e70539a9-144d-4900-807e-914ae0cc8539 {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1000.985723] env[59518]: DEBUG oslo_concurrency.lockutils [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1002.514024] env[59518]: DEBUG nova.compute.manager [req-36e3349a-9304-485f-83a1-65e13690e8c2 req-45566d70-b8cd-454b-a8ea-5d42b54ccb2f service nova] [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] Received event network-vif-deleted-6b2feb20-91d6-4694-972e-e6de127559fb {{(pid=59518) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 1010.449697] env[59518]: DEBUG nova.compute.manager [req-15ffbd83-f4a0-4684-bcb4-5e33e566a9aa req-673156ed-c55e-4bd4-b8d1-d63da9d2556f service nova] [instance: 3981aa30-0515-4764-9aac-d0c99a48b064] Received event network-vif-deleted-9666ae20-185a-4053-8929-f337e34f5252 {{(pid=59518) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 1036.795466] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1040.444603] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1040.463016] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1041.448029] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1041.448365] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Starting heal instance info cache {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9808}} [ 1041.448365] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Rebuilding the list of instances to heal {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}} [ 1041.464128] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Skipping network cache update for instance 
because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 1041.464343] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 1041.464479] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: 282b61db-76cd-44c3-b500-7a465e903c97] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 1041.464602] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 1041.464719] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 1041.464835] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 1041.464949] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Didn't find any instances for network info cache update. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9894}} [ 1042.447647] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1042.447905] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1042.448022] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=59518) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10427}} [ 1043.449200] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1043.449573] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1044.444073] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1045.447704] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1045.457796] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1045.458042] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1045.458236] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1045.458417] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59518) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1045.459845] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6859e11d-45d9-4ff5-9822-a4b49124942f {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1045.469901] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d6cc26f8-beda-4837-97d5-49963496f3e7 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1045.484599] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a7528bec-eb04-42fd-8147-000214b3c2d2 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1045.490907] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2677aa1c-7004-4d24-8690-8fe34d689178 {{(pid=59518) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1045.521729] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181755MB free_disk=174GB free_vcpus=48 pci_devices=None {{(pid=59518) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 1045.521884] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1045.522114] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1045.573302] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance a85abea1-8e8d-4007-803d-e36fff55e587 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1045.573486] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 04a58b0b-dfd8-4227-9c10-a69225fa5a53 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1045.573624] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 282b61db-76cd-44c3-b500-7a465e903c97 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1045.573780] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 3e59b5d7-978d-405a-b68a-47ee03b9a713 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1045.573960] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance af4b8dd9-a05d-427e-a147-76c7cfec5862 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1045.574026] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance ae88d565-bbf5-4c29-aee9-364c23086de5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1045.584033] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 39cfe606-43a0-4a52-8ec1-433baf7a3aec has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 1045.584227] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Total usable vcpus: 48, total allocated vcpus: 6 {{(pid=59518) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 1045.584371] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1280MB phys_disk=200GB used_disk=6GB total_vcpus=48 used_vcpus=6 pci_stats=[] {{(pid=59518) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 1045.668023] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-66955769-1d78-4c37-983c-56fe699278bf {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1045.675006] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5eb9a0c3-947f-40ab-ab9e-104f0cdab789 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1045.704728] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-36c02dd3-fece-4bda-9aaf-9d95d17c6d95 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1045.710922] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca140d41-058e-443a-9b81-a6a96cb7d0ad {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1045.723453] env[59518]: DEBUG nova.compute.provider_tree [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Inventory has not changed in ProviderTree for provider: ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd {{(pid=59518) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1045.732243] env[59518]: DEBUG nova.scheduler.client.report [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Inventory has not changed for provider ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 174, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59518) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1045.744439] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59518) _update_available_resource 
/opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 1045.744606] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.223s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1047.210589] env[59518]: WARNING oslo_vmware.rw_handles [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1047.210589] env[59518]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1047.210589] env[59518]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1047.210589] env[59518]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1047.210589] env[59518]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1047.210589] env[59518]: ERROR oslo_vmware.rw_handles response.begin() [ 1047.210589] env[59518]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1047.210589] env[59518]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1047.210589] env[59518]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1047.210589] env[59518]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1047.210589] env[59518]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1047.210589] env[59518]: ERROR oslo_vmware.rw_handles [ 1047.211774] env[59518]: DEBUG nova.virt.vmwareapi.images [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Downloaded image file data e70539a9-144d-4900-807e-914ae0cc8539 to vmware_temp/8b5ae967-99a9-42f7-b805-57ce670a3591/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk on the data store datastore1 {{(pid=59518) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1047.213851] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Caching image {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1047.214261] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Copying Virtual Disk [datastore1] vmware_temp/8b5ae967-99a9-42f7-b805-57ce670a3591/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk to [datastore1] vmware_temp/8b5ae967-99a9-42f7-b805-57ce670a3591/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk {{(pid=59518) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1047.214677] env[59518]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with 
opID=oslo.vmware-6d3496d9-7a74-4992-aaa2-3a4057a00942 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1047.221996] env[59518]: DEBUG oslo_vmware.api [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Waiting for the task: (returnval){ [ 1047.221996] env[59518]: value = "task-307994" [ 1047.221996] env[59518]: _type = "Task" [ 1047.221996] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1047.229689] env[59518]: DEBUG oslo_vmware.api [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Task: {'id': task-307994, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1047.732313] env[59518]: DEBUG oslo_vmware.exceptions [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Fault InvalidArgument not matched. {{(pid=59518) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1047.732599] env[59518]: DEBUG oslo_concurrency.lockutils [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1047.733135] env[59518]: ERROR nova.compute.manager [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1047.733135] env[59518]: Faults: ['InvalidArgument'] [ 1047.733135] env[59518]: ERROR nova.compute.manager [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Traceback (most recent call last): [ 1047.733135] env[59518]: ERROR nova.compute.manager [instance: a85abea1-8e8d-4007-803d-e36fff55e587] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1047.733135] env[59518]: ERROR nova.compute.manager [instance: a85abea1-8e8d-4007-803d-e36fff55e587] yield resources [ 1047.733135] env[59518]: ERROR nova.compute.manager [instance: a85abea1-8e8d-4007-803d-e36fff55e587] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1047.733135] env[59518]: ERROR nova.compute.manager [instance: a85abea1-8e8d-4007-803d-e36fff55e587] self.driver.spawn(context, instance, image_meta, [ 1047.733135] env[59518]: ERROR nova.compute.manager [instance: a85abea1-8e8d-4007-803d-e36fff55e587] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1047.733135] env[59518]: ERROR nova.compute.manager [instance: a85abea1-8e8d-4007-803d-e36fff55e587] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1047.733135] env[59518]: ERROR nova.compute.manager [instance: a85abea1-8e8d-4007-803d-e36fff55e587] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1047.733135] env[59518]: ERROR 
nova.compute.manager [instance: a85abea1-8e8d-4007-803d-e36fff55e587] self._fetch_image_if_missing(context, vi) [ 1047.733135] env[59518]: ERROR nova.compute.manager [instance: a85abea1-8e8d-4007-803d-e36fff55e587] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1047.733452] env[59518]: ERROR nova.compute.manager [instance: a85abea1-8e8d-4007-803d-e36fff55e587] image_cache(vi, tmp_image_ds_loc) [ 1047.733452] env[59518]: ERROR nova.compute.manager [instance: a85abea1-8e8d-4007-803d-e36fff55e587] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1047.733452] env[59518]: ERROR nova.compute.manager [instance: a85abea1-8e8d-4007-803d-e36fff55e587] vm_util.copy_virtual_disk( [ 1047.733452] env[59518]: ERROR nova.compute.manager [instance: a85abea1-8e8d-4007-803d-e36fff55e587] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1047.733452] env[59518]: ERROR nova.compute.manager [instance: a85abea1-8e8d-4007-803d-e36fff55e587] session._wait_for_task(vmdk_copy_task) [ 1047.733452] env[59518]: ERROR nova.compute.manager [instance: a85abea1-8e8d-4007-803d-e36fff55e587] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1047.733452] env[59518]: ERROR nova.compute.manager [instance: a85abea1-8e8d-4007-803d-e36fff55e587] return self.wait_for_task(task_ref) [ 1047.733452] env[59518]: ERROR nova.compute.manager [instance: a85abea1-8e8d-4007-803d-e36fff55e587] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1047.733452] env[59518]: ERROR nova.compute.manager [instance: a85abea1-8e8d-4007-803d-e36fff55e587] return evt.wait() [ 1047.733452] env[59518]: ERROR nova.compute.manager [instance: a85abea1-8e8d-4007-803d-e36fff55e587] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1047.733452] env[59518]: ERROR nova.compute.manager [instance: a85abea1-8e8d-4007-803d-e36fff55e587] result = hub.switch() [ 1047.733452] env[59518]: ERROR nova.compute.manager [instance: a85abea1-8e8d-4007-803d-e36fff55e587] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1047.733452] env[59518]: ERROR nova.compute.manager [instance: a85abea1-8e8d-4007-803d-e36fff55e587] return self.greenlet.switch() [ 1047.733763] env[59518]: ERROR nova.compute.manager [instance: a85abea1-8e8d-4007-803d-e36fff55e587] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1047.733763] env[59518]: ERROR nova.compute.manager [instance: a85abea1-8e8d-4007-803d-e36fff55e587] self.f(*self.args, **self.kw) [ 1047.733763] env[59518]: ERROR nova.compute.manager [instance: a85abea1-8e8d-4007-803d-e36fff55e587] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1047.733763] env[59518]: ERROR nova.compute.manager [instance: a85abea1-8e8d-4007-803d-e36fff55e587] raise exceptions.translate_fault(task_info.error) [ 1047.733763] env[59518]: ERROR nova.compute.manager [instance: a85abea1-8e8d-4007-803d-e36fff55e587] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1047.733763] env[59518]: ERROR nova.compute.manager [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Faults: ['InvalidArgument'] [ 1047.733763] env[59518]: ERROR nova.compute.manager [instance: a85abea1-8e8d-4007-803d-e36fff55e587] [ 1047.733763] env[59518]: INFO nova.compute.manager [None 
req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Terminating instance [ 1047.735422] env[59518]: DEBUG oslo_concurrency.lockutils [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1047.735422] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1047.735422] env[59518]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-794e863c-d77b-4dd8-8957-66e3d32ed5ac {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1047.737533] env[59518]: DEBUG nova.compute.manager [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Start destroying the instance on the hypervisor. {{(pid=59518) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1047.737742] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Destroying instance {{(pid=59518) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1047.738456] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb40286d-51ce-4f31-8c06-e24334f76fcf {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1047.744847] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Unregistering the VM {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1047.745041] env[59518]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-f32caaf6-68f6-4601-a3ab-927ebe591716 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1047.747172] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1047.747330] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=59518) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1047.748229] env[59518]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7f2709d4-a2cb-4e92-bf0a-dd3eaed282e0 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1047.752553] env[59518]: DEBUG oslo_vmware.api [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Waiting for the task: (returnval){ [ 1047.752553] env[59518]: value = "session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]526522cc-ed0c-f888-3c30-d0f057c38006" [ 1047.752553] env[59518]: _type = "Task" [ 1047.752553] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1047.759321] env[59518]: DEBUG oslo_vmware.api [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Task: {'id': session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]526522cc-ed0c-f888-3c30-d0f057c38006, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1047.809492] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Unregistered the VM {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1047.809681] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Deleting contents of the VM from datastore datastore1 {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1047.809853] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Deleting the datastore file [datastore1] a85abea1-8e8d-4007-803d-e36fff55e587 {{(pid=59518) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1047.810108] env[59518]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-aacd8d82-85fc-4b75-a0a6-ee1d8acbd650 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1047.816223] env[59518]: DEBUG oslo_vmware.api [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Waiting for the task: (returnval){ [ 1047.816223] env[59518]: value = "task-307996" [ 1047.816223] env[59518]: _type = "Task" [ 1047.816223] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1047.823515] env[59518]: DEBUG oslo_vmware.api [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Task: {'id': task-307996, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1048.262268] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Preparing fetch location {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1048.262617] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Creating directory with path [datastore1] vmware_temp/9aec810f-b0e0-472c-bf6a-761aef05f237/e70539a9-144d-4900-807e-914ae0cc8539 {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1048.262721] env[59518]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a477f65c-666a-4bb1-8857-563c9656983c {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1048.273819] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Created directory with path [datastore1] vmware_temp/9aec810f-b0e0-472c-bf6a-761aef05f237/e70539a9-144d-4900-807e-914ae0cc8539 {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1048.274034] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Fetch image to [datastore1] vmware_temp/9aec810f-b0e0-472c-bf6a-761aef05f237/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1048.274252] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Downloading image file data e70539a9-144d-4900-807e-914ae0cc8539 to [datastore1] vmware_temp/9aec810f-b0e0-472c-bf6a-761aef05f237/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk on the data store datastore1 {{(pid=59518) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1048.274988] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7096a698-0e68-4bdd-9a5d-09d5c7473d55 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1048.281199] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a75b4721-60eb-46b4-ac81-0c48aba3ef9d {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1048.289775] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e858e0e2-26f6-4d33-b99e-57e2e4cdc29e {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1048.322695] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ee37bc1e-4aa5-4f3b-8907-575e3da6368d 
{{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1048.329170] env[59518]: DEBUG oslo_vmware.api [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Task: {'id': task-307996, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.069937} completed successfully. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1048.330557] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Deleted the datastore file {{(pid=59518) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1048.330786] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Deleted contents of the VM from datastore datastore1 {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1048.331008] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Instance destroyed {{(pid=59518) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1048.331226] env[59518]: INFO nova.compute.manager [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Took 0.59 seconds to destroy the instance on the hypervisor. 
[ 1048.333121] env[59518]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-db9d132c-5189-4c42-bddb-884c6f1398d5 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1048.334922] env[59518]: DEBUG nova.compute.claims [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Aborting claim: {{(pid=59518) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1048.335119] env[59518]: DEBUG oslo_concurrency.lockutils [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1048.335368] env[59518]: DEBUG oslo_concurrency.lockutils [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1048.357001] env[59518]: DEBUG nova.virt.vmwareapi.images [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Downloading image file data e70539a9-144d-4900-807e-914ae0cc8539 to the data store datastore1 {{(pid=59518) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1048.404575] env[59518]: DEBUG oslo_vmware.rw_handles [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/9aec810f-b0e0-472c-bf6a-761aef05f237/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59518) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1048.458481] env[59518]: DEBUG oslo_vmware.rw_handles [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Completed reading data from the image iterator. {{(pid=59518) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1048.458655] env[59518]: DEBUG oslo_vmware.rw_handles [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/9aec810f-b0e0-472c-bf6a-761aef05f237/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=59518) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1048.508908] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cd095a12-3c68-45f5-829f-c065f24782bd {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1048.516357] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d73a68de-a300-4647-858e-6ebcabb29502 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1048.545834] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-36ffee08-3025-42f1-90b2-0ff52085ddd2 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1048.552321] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cc6fac6f-bdfc-4053-a67e-6e21da09428b {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1048.565506] env[59518]: DEBUG nova.compute.provider_tree [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Inventory has not changed in ProviderTree for provider: ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd {{(pid=59518) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1048.573388] env[59518]: DEBUG nova.scheduler.client.report [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Inventory has not changed for provider ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 174, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59518) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1048.587270] env[59518]: DEBUG oslo_concurrency.lockutils [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.252s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1048.587765] env[59518]: ERROR nova.compute.manager [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1048.587765] env[59518]: Faults: ['InvalidArgument'] [ 1048.587765] env[59518]: ERROR nova.compute.manager [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Traceback (most recent call last): [ 1048.587765] env[59518]: ERROR nova.compute.manager [instance: a85abea1-8e8d-4007-803d-e36fff55e587] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1048.587765] env[59518]: ERROR nova.compute.manager 
[instance: a85abea1-8e8d-4007-803d-e36fff55e587] self.driver.spawn(context, instance, image_meta, [ 1048.587765] env[59518]: ERROR nova.compute.manager [instance: a85abea1-8e8d-4007-803d-e36fff55e587] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1048.587765] env[59518]: ERROR nova.compute.manager [instance: a85abea1-8e8d-4007-803d-e36fff55e587] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1048.587765] env[59518]: ERROR nova.compute.manager [instance: a85abea1-8e8d-4007-803d-e36fff55e587] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1048.587765] env[59518]: ERROR nova.compute.manager [instance: a85abea1-8e8d-4007-803d-e36fff55e587] self._fetch_image_if_missing(context, vi) [ 1048.587765] env[59518]: ERROR nova.compute.manager [instance: a85abea1-8e8d-4007-803d-e36fff55e587] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1048.587765] env[59518]: ERROR nova.compute.manager [instance: a85abea1-8e8d-4007-803d-e36fff55e587] image_cache(vi, tmp_image_ds_loc) [ 1048.587765] env[59518]: ERROR nova.compute.manager [instance: a85abea1-8e8d-4007-803d-e36fff55e587] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1048.588124] env[59518]: ERROR nova.compute.manager [instance: a85abea1-8e8d-4007-803d-e36fff55e587] vm_util.copy_virtual_disk( [ 1048.588124] env[59518]: ERROR nova.compute.manager [instance: a85abea1-8e8d-4007-803d-e36fff55e587] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1048.588124] env[59518]: ERROR nova.compute.manager [instance: a85abea1-8e8d-4007-803d-e36fff55e587] session._wait_for_task(vmdk_copy_task) [ 1048.588124] env[59518]: ERROR nova.compute.manager [instance: a85abea1-8e8d-4007-803d-e36fff55e587] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1048.588124] env[59518]: ERROR nova.compute.manager [instance: a85abea1-8e8d-4007-803d-e36fff55e587] return self.wait_for_task(task_ref) [ 1048.588124] env[59518]: ERROR nova.compute.manager [instance: a85abea1-8e8d-4007-803d-e36fff55e587] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1048.588124] env[59518]: ERROR nova.compute.manager [instance: a85abea1-8e8d-4007-803d-e36fff55e587] return evt.wait() [ 1048.588124] env[59518]: ERROR nova.compute.manager [instance: a85abea1-8e8d-4007-803d-e36fff55e587] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1048.588124] env[59518]: ERROR nova.compute.manager [instance: a85abea1-8e8d-4007-803d-e36fff55e587] result = hub.switch() [ 1048.588124] env[59518]: ERROR nova.compute.manager [instance: a85abea1-8e8d-4007-803d-e36fff55e587] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1048.588124] env[59518]: ERROR nova.compute.manager [instance: a85abea1-8e8d-4007-803d-e36fff55e587] return self.greenlet.switch() [ 1048.588124] env[59518]: ERROR nova.compute.manager [instance: a85abea1-8e8d-4007-803d-e36fff55e587] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1048.588124] env[59518]: ERROR nova.compute.manager [instance: a85abea1-8e8d-4007-803d-e36fff55e587] self.f(*self.args, **self.kw) [ 1048.588489] env[59518]: ERROR nova.compute.manager [instance: a85abea1-8e8d-4007-803d-e36fff55e587] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task 
[ 1048.588489] env[59518]: ERROR nova.compute.manager [instance: a85abea1-8e8d-4007-803d-e36fff55e587] raise exceptions.translate_fault(task_info.error) [ 1048.588489] env[59518]: ERROR nova.compute.manager [instance: a85abea1-8e8d-4007-803d-e36fff55e587] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1048.588489] env[59518]: ERROR nova.compute.manager [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Faults: ['InvalidArgument'] [ 1048.588489] env[59518]: ERROR nova.compute.manager [instance: a85abea1-8e8d-4007-803d-e36fff55e587] [ 1048.588489] env[59518]: DEBUG nova.compute.utils [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] VimFaultException {{(pid=59518) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1048.589847] env[59518]: DEBUG nova.compute.manager [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Build of instance a85abea1-8e8d-4007-803d-e36fff55e587 was re-scheduled: A specified parameter was not correct: fileType [ 1048.589847] env[59518]: Faults: ['InvalidArgument'] {{(pid=59518) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 1048.590210] env[59518]: DEBUG nova.compute.manager [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Unplugging VIFs for instance {{(pid=59518) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1048.590372] env[59518]: DEBUG nova.compute.manager [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59518) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1048.590530] env[59518]: DEBUG nova.compute.manager [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Deallocating network for instance {{(pid=59518) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1048.590685] env[59518]: DEBUG nova.network.neutron [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] deallocate_for_instance() {{(pid=59518) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1048.816669] env[59518]: DEBUG nova.network.neutron [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Updating instance_info_cache with network_info: [] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1048.829670] env[59518]: INFO nova.compute.manager [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Took 0.24 seconds to deallocate network for instance. [ 1048.912997] env[59518]: INFO nova.scheduler.client.report [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Deleted allocations for instance a85abea1-8e8d-4007-803d-e36fff55e587 [ 1048.927940] env[59518]: DEBUG oslo_concurrency.lockutils [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Lock "a85abea1-8e8d-4007-803d-e36fff55e587" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 431.009s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1048.929016] env[59518]: DEBUG oslo_concurrency.lockutils [None req-5b6a5a77-a1fc-4f8d-a1c6-74c4a91c9b5e tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Lock "a85abea1-8e8d-4007-803d-e36fff55e587" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 228.843s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1048.929220] env[59518]: DEBUG oslo_concurrency.lockutils [None req-5b6a5a77-a1fc-4f8d-a1c6-74c4a91c9b5e tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Acquiring lock "a85abea1-8e8d-4007-803d-e36fff55e587-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1048.929460] env[59518]: DEBUG oslo_concurrency.lockutils [None req-5b6a5a77-a1fc-4f8d-a1c6-74c4a91c9b5e tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Lock "a85abea1-8e8d-4007-803d-e36fff55e587-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1048.929634] env[59518]: DEBUG oslo_concurrency.lockutils [None req-5b6a5a77-a1fc-4f8d-a1c6-74c4a91c9b5e tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Lock "a85abea1-8e8d-4007-803d-e36fff55e587-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1048.931548] env[59518]: INFO nova.compute.manager [None req-5b6a5a77-a1fc-4f8d-a1c6-74c4a91c9b5e tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Terminating instance [ 1048.933280] env[59518]: DEBUG nova.compute.manager [None req-5b6a5a77-a1fc-4f8d-a1c6-74c4a91c9b5e tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Start destroying the instance on the hypervisor. {{(pid=59518) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1048.933464] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-5b6a5a77-a1fc-4f8d-a1c6-74c4a91c9b5e tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Destroying instance {{(pid=59518) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1048.933901] env[59518]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-b4a3fbe0-d489-477a-8e21-5f533a86c271 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1048.943296] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6483aee3-208e-4e65-821c-793e6956eb8d {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1048.956728] env[59518]: DEBUG nova.compute.manager [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] Starting instance... {{(pid=59518) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1048.975421] env[59518]: WARNING nova.virt.vmwareapi.vmops [None req-5b6a5a77-a1fc-4f8d-a1c6-74c4a91c9b5e tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance a85abea1-8e8d-4007-803d-e36fff55e587 could not be found. 
[ 1048.975592] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-5b6a5a77-a1fc-4f8d-a1c6-74c4a91c9b5e tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Instance destroyed {{(pid=59518) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1048.975759] env[59518]: INFO nova.compute.manager [None req-5b6a5a77-a1fc-4f8d-a1c6-74c4a91c9b5e tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1048.975981] env[59518]: DEBUG oslo.service.loopingcall [None req-5b6a5a77-a1fc-4f8d-a1c6-74c4a91c9b5e tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=59518) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1048.976202] env[59518]: DEBUG nova.compute.manager [-] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Deallocating network for instance {{(pid=59518) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1048.976294] env[59518]: DEBUG nova.network.neutron [-] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] deallocate_for_instance() {{(pid=59518) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1049.001660] env[59518]: DEBUG nova.network.neutron [-] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Updating instance_info_cache with network_info: [] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1049.006486] env[59518]: DEBUG oslo_concurrency.lockutils [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1049.006705] env[59518]: DEBUG oslo_concurrency.lockutils [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1049.008434] env[59518]: INFO nova.compute.claims [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1049.011570] env[59518]: INFO nova.compute.manager [-] [instance: a85abea1-8e8d-4007-803d-e36fff55e587] Took 0.04 seconds to deallocate network for instance.
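The "Waiting for function ... _deallocate_network_with_retries to return." record above comes from oslo.service's retry machinery, which re-invokes a wrapped callable until it stops raising. A minimal sketch, assuming oslo.service's RetryDecorator (the retry budget and exception type here are illustrative):

    from oslo_service import loopingcall

    # Each attempt that raises one of the listed exceptions is retried after an
    # increasing sleep; exhausting max_retry_count re-raises the last error.
    @loopingcall.RetryDecorator(max_retry_count=3, inc_sleep_time=1,
                                max_sleep_time=10, exceptions=(ConnectionError,))
    def _deallocate_network_with_retries():
        pass  # stand-in body; Nova's real helper tears down the Neutron ports

    _deallocate_network_with_retries()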
[ 1049.104937] env[59518]: DEBUG oslo_concurrency.lockutils [None req-5b6a5a77-a1fc-4f8d-a1c6-74c4a91c9b5e tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Lock "a85abea1-8e8d-4007-803d-e36fff55e587" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.176s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1049.147942] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-97f5a719-f742-4b08-bb8c-88334d0f271f {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1049.155551] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f0a2300-7566-404b-b2cf-607ea1ed6de2 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1049.189242] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8d36f5e1-8f8e-4779-880e-b66b0d008af4 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1049.196351] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35486bf4-bb51-42c5-9935-de9145e4a6df {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1049.209657] env[59518]: DEBUG nova.compute.provider_tree [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Inventory has not changed in ProviderTree for provider: ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd {{(pid=59518) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1049.219110] env[59518]: DEBUG nova.scheduler.client.report [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Inventory has not changed for provider ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 174, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59518) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1049.233300] env[59518]: DEBUG oslo_concurrency.lockutils [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.226s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1049.233774] env[59518]: DEBUG nova.compute.manager [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] Start building networks asynchronously for instance.
{{(pid=59518) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1049.266032] env[59518]: DEBUG nova.compute.utils [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Using /dev/sd instead of None {{(pid=59518) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1049.267783] env[59518]: DEBUG nova.compute.manager [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] Allocating IP information in the background. {{(pid=59518) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1049.267952] env[59518]: DEBUG nova.network.neutron [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] allocate_for_instance() {{(pid=59518) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1049.275891] env[59518]: DEBUG nova.compute.manager [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] Start building block device mappings for instance. {{(pid=59518) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1049.334920] env[59518]: DEBUG nova.compute.manager [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] Start spawning the instance on the hypervisor. 
{{(pid=59518) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1049.355750] env[59518]: DEBUG nova.virt.hardware [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-05-28T13:15:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-05-28T13:15:21Z,direct_url=,disk_format='vmdk',id=e70539a9-144d-4900-807e-914ae0cc8539,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='33bab75ff2cf45ecb4ab54af3adf83ee',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-05-28T13:15:22Z,virtual_size=,visibility=), allow threads: False {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1049.355982] env[59518]: DEBUG nova.virt.hardware [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Flavor limits 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1049.356148] env[59518]: DEBUG nova.virt.hardware [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Image limits 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1049.356329] env[59518]: DEBUG nova.virt.hardware [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Flavor pref 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1049.356472] env[59518]: DEBUG nova.virt.hardware [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Image pref 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1049.356611] env[59518]: DEBUG nova.virt.hardware [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1049.356808] env[59518]: DEBUG nova.virt.hardware [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1049.356956] env[59518]: DEBUG nova.virt.hardware [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59518) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1049.357111] env[59518]: DEBUG nova.virt.hardware [None 
req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Got 1 possible topologies {{(pid=59518) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1049.357318] env[59518]: DEBUG nova.virt.hardware [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1049.357499] env[59518]: DEBUG nova.virt.hardware [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1049.358513] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2aca9964-6338-4815-9f7d-ccac7741d4da {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1049.365986] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5948d5fd-83b6-4f9e-aa98-c1ba55a1bbb5 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1049.372172] env[59518]: DEBUG nova.policy [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e76e6170983343eb98ce9b38f7160f5e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'de52be7e32e5496f8ee12e4750b3644d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59518) authorize /opt/stack/nova/nova/policy.py:203}} [ 1049.805814] env[59518]: DEBUG nova.network.neutron [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] Successfully created port: 1ad5bfc1-f374-42b5-8ca7-d91415bebbdd {{(pid=59518) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1050.761165] env[59518]: DEBUG nova.network.neutron [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] Successfully updated port: 1ad5bfc1-f374-42b5-8ca7-d91415bebbdd {{(pid=59518) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1050.773883] env[59518]: DEBUG oslo_concurrency.lockutils [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Acquiring lock "refresh_cache-39cfe606-43a0-4a52-8ec1-433baf7a3aec" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1050.774033] env[59518]: DEBUG oslo_concurrency.lockutils [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 
tempest-DeleteServersTestJSON-659144729-project-member] Acquired lock "refresh_cache-39cfe606-43a0-4a52-8ec1-433baf7a3aec" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1050.774182] env[59518]: DEBUG nova.network.neutron [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] Building network info cache for instance {{(pid=59518) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1050.804678] env[59518]: DEBUG nova.network.neutron [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] Instance cache missing network info. {{(pid=59518) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1050.860502] env[59518]: DEBUG nova.compute.manager [req-0386246d-c4bc-400e-a8e2-e048a3794f3a req-da751130-51af-4661-8256-2158dad5b592 service nova] [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] Received event network-vif-plugged-1ad5bfc1-f374-42b5-8ca7-d91415bebbdd {{(pid=59518) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 1050.860811] env[59518]: DEBUG oslo_concurrency.lockutils [req-0386246d-c4bc-400e-a8e2-e048a3794f3a req-da751130-51af-4661-8256-2158dad5b592 service nova] Acquiring lock "39cfe606-43a0-4a52-8ec1-433baf7a3aec-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1050.861144] env[59518]: DEBUG oslo_concurrency.lockutils [req-0386246d-c4bc-400e-a8e2-e048a3794f3a req-da751130-51af-4661-8256-2158dad5b592 service nova] Lock "39cfe606-43a0-4a52-8ec1-433baf7a3aec-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1050.861324] env[59518]: DEBUG oslo_concurrency.lockutils [req-0386246d-c4bc-400e-a8e2-e048a3794f3a req-da751130-51af-4661-8256-2158dad5b592 service nova] Lock "39cfe606-43a0-4a52-8ec1-433baf7a3aec-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1050.861479] env[59518]: DEBUG nova.compute.manager [req-0386246d-c4bc-400e-a8e2-e048a3794f3a req-da751130-51af-4661-8256-2158dad5b592 service nova] [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] No waiting events found dispatching network-vif-plugged-1ad5bfc1-f374-42b5-8ca7-d91415bebbdd {{(pid=59518) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1050.861629] env[59518]: WARNING nova.compute.manager [req-0386246d-c4bc-400e-a8e2-e048a3794f3a req-da751130-51af-4661-8256-2158dad5b592 service nova] [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] Received unexpected event network-vif-plugged-1ad5bfc1-f374-42b5-8ca7-d91415bebbdd for instance with vm_state building and task_state spawning.
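The pop_instance_event records above are the receiving half of the Neutron external-event handshake: the spawn path registers a waiter for network-vif-plugged-<port>, and the "Received unexpected event" warning fires when the notification lands before any waiter is registered, as it did here. A rough sketch of the waiting side, assuming Nova's virtapi.wait_for_instance_event context manager (the helper function, deadline, and plug_vifs call are illustrative):

    def plug_and_wait(virtapi, driver, instance, network_info, port_id):
        # Block until Neutron delivers network-vif-plugged for this port or the
        # deadline expires; an event with no registered waiter is logged as
        # "unexpected", as seen above.
        events = [('network-vif-plugged', port_id)]
        with virtapi.wait_for_instance_event(instance, events, deadline=300):
            driver.plug_vifs(instance, network_info)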
[ 1050.861772] env[59518]: DEBUG nova.compute.manager [req-0386246d-c4bc-400e-a8e2-e048a3794f3a req-da751130-51af-4661-8256-2158dad5b592 service nova] [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] Received event network-changed-1ad5bfc1-f374-42b5-8ca7-d91415bebbdd {{(pid=59518) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 1050.861911] env[59518]: DEBUG nova.compute.manager [req-0386246d-c4bc-400e-a8e2-e048a3794f3a req-da751130-51af-4661-8256-2158dad5b592 service nova] [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] Refreshing instance network info cache due to event network-changed-1ad5bfc1-f374-42b5-8ca7-d91415bebbdd. {{(pid=59518) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 1050.862484] env[59518]: DEBUG oslo_concurrency.lockutils [req-0386246d-c4bc-400e-a8e2-e048a3794f3a req-da751130-51af-4661-8256-2158dad5b592 service nova] Acquiring lock "refresh_cache-39cfe606-43a0-4a52-8ec1-433baf7a3aec" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1050.944470] env[59518]: DEBUG nova.network.neutron [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] Updating instance_info_cache with network_info: [{"id": "1ad5bfc1-f374-42b5-8ca7-d91415bebbdd", "address": "fa:16:3e:fd:c5:fd", "network": {"id": "2a11bd53-b61c-45e8-bbcb-745d685ad1b2", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1418418357-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "de52be7e32e5496f8ee12e4750b3644d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ae4e3171-21cd-4094-b6cf-81bf366c75bd", "external-id": "nsx-vlan-transportzone-193", "segmentation_id": 193, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1ad5bfc1-f3", "ovs_interfaceid": "1ad5bfc1-f374-42b5-8ca7-d91415bebbdd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1050.957961] env[59518]: DEBUG oslo_concurrency.lockutils [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Releasing lock "refresh_cache-39cfe606-43a0-4a52-8ec1-433baf7a3aec" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1050.958326] env[59518]: DEBUG nova.compute.manager [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] Instance network_info: |[{"id": "1ad5bfc1-f374-42b5-8ca7-d91415bebbdd", "address": "fa:16:3e:fd:c5:fd", "network": {"id": "2a11bd53-b61c-45e8-bbcb-745d685ad1b2", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1418418357-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": 
{"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "de52be7e32e5496f8ee12e4750b3644d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ae4e3171-21cd-4094-b6cf-81bf366c75bd", "external-id": "nsx-vlan-transportzone-193", "segmentation_id": 193, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1ad5bfc1-f3", "ovs_interfaceid": "1ad5bfc1-f374-42b5-8ca7-d91415bebbdd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59518) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 1050.958614] env[59518]: DEBUG oslo_concurrency.lockutils [req-0386246d-c4bc-400e-a8e2-e048a3794f3a req-da751130-51af-4661-8256-2158dad5b592 service nova] Acquired lock "refresh_cache-39cfe606-43a0-4a52-8ec1-433baf7a3aec" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1050.958787] env[59518]: DEBUG nova.network.neutron [req-0386246d-c4bc-400e-a8e2-e048a3794f3a req-da751130-51af-4661-8256-2158dad5b592 service nova] [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] Refreshing network info cache for port 1ad5bfc1-f374-42b5-8ca7-d91415bebbdd {{(pid=59518) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1050.959964] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:fd:c5:fd', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ae4e3171-21cd-4094-b6cf-81bf366c75bd', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '1ad5bfc1-f374-42b5-8ca7-d91415bebbdd', 'vif_model': 'vmxnet3'}] {{(pid=59518) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1050.967825] env[59518]: DEBUG oslo.service.loopingcall [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59518) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1050.968650] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] Creating VM on the ESX host {{(pid=59518) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1050.971007] env[59518]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-04e18f58-687e-415d-865e-d27614cc2e86 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1050.991108] env[59518]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1050.991108] env[59518]: value = "task-307997" [ 1050.991108] env[59518]: _type = "Task" [ 1050.991108] env[59518]: } to complete. 
{{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1050.998715] env[59518]: DEBUG oslo_vmware.api [-] Task: {'id': task-307997, 'name': CreateVM_Task} progress is 0%. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1051.277258] env[59518]: DEBUG nova.network.neutron [req-0386246d-c4bc-400e-a8e2-e048a3794f3a req-da751130-51af-4661-8256-2158dad5b592 service nova] [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] Updated VIF entry in instance network info cache for port 1ad5bfc1-f374-42b5-8ca7-d91415bebbdd. {{(pid=59518) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1051.277601] env[59518]: DEBUG nova.network.neutron [req-0386246d-c4bc-400e-a8e2-e048a3794f3a req-da751130-51af-4661-8256-2158dad5b592 service nova] [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] Updating instance_info_cache with network_info: [{"id": "1ad5bfc1-f374-42b5-8ca7-d91415bebbdd", "address": "fa:16:3e:fd:c5:fd", "network": {"id": "2a11bd53-b61c-45e8-bbcb-745d685ad1b2", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1418418357-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "de52be7e32e5496f8ee12e4750b3644d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ae4e3171-21cd-4094-b6cf-81bf366c75bd", "external-id": "nsx-vlan-transportzone-193", "segmentation_id": 193, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1ad5bfc1-f3", "ovs_interfaceid": "1ad5bfc1-f374-42b5-8ca7-d91415bebbdd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1051.287474] env[59518]: DEBUG oslo_concurrency.lockutils [req-0386246d-c4bc-400e-a8e2-e048a3794f3a req-da751130-51af-4661-8256-2158dad5b592 service nova] Releasing lock "refresh_cache-39cfe606-43a0-4a52-8ec1-433baf7a3aec" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1051.500665] env[59518]: DEBUG oslo_vmware.api [-] Task: {'id': task-307997, 'name': CreateVM_Task, 'duration_secs': 0.462775} completed successfully. 
{{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1051.500832] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] Created VM on the ESX host {{(pid=59518) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1051.501761] env[59518]: DEBUG oslo_concurrency.lockutils [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1051.501841] env[59518]: DEBUG oslo_concurrency.lockutils [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1051.502174] env[59518]: DEBUG oslo_concurrency.lockutils [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1051.502428] env[59518]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-caee6ab1-1795-4b8c-a603-ad273ed6325e {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1051.507063] env[59518]: DEBUG oslo_vmware.api [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Waiting for the task: (returnval){ [ 1051.507063] env[59518]: value = "session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]52110f01-1309-81f3-03a1-ba358aef6d88" [ 1051.507063] env[59518]: _type = "Task" [ 1051.507063] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1051.514546] env[59518]: DEBUG oslo_vmware.api [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Task: {'id': session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]52110f01-1309-81f3-03a1-ba358aef6d88, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1052.017125] env[59518]: DEBUG oslo_concurrency.lockutils [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1052.017476] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] Processing image e70539a9-144d-4900-807e-914ae0cc8539 {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1052.017567] env[59518]: DEBUG oslo_concurrency.lockutils [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1053.427946] env[59518]: DEBUG oslo_concurrency.lockutils [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] Acquiring lock "468e2dc5-6a66-401d-b6cd-06bb94cea0ef" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1053.428271] env[59518]: DEBUG oslo_concurrency.lockutils [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] Lock "468e2dc5-6a66-401d-b6cd-06bb94cea0ef" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1097.654040] env[59518]: WARNING oslo_vmware.rw_handles [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1097.654040] env[59518]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1097.654040] env[59518]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1097.654040] env[59518]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1097.654040] env[59518]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1097.654040] env[59518]: ERROR oslo_vmware.rw_handles response.begin() [ 1097.654040] env[59518]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1097.654040] env[59518]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1097.654040] env[59518]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1097.654040] env[59518]: ERROR oslo_vmware.rw_handles raise
RemoteDisconnected("Remote end closed connection without" [ 1097.654040] env[59518]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1097.654040] env[59518]: ERROR oslo_vmware.rw_handles [ 1097.654717] env[59518]: DEBUG nova.virt.vmwareapi.images [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Downloaded image file data e70539a9-144d-4900-807e-914ae0cc8539 to vmware_temp/9aec810f-b0e0-472c-bf6a-761aef05f237/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk on the data store datastore1 {{(pid=59518) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1097.656286] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Caching image {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1097.656519] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Copying Virtual Disk [datastore1] vmware_temp/9aec810f-b0e0-472c-bf6a-761aef05f237/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk to [datastore1] vmware_temp/9aec810f-b0e0-472c-bf6a-761aef05f237/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk {{(pid=59518) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1097.656809] env[59518]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-598a4f96-4d34-4a20-95cb-e727c8cee2c4 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1097.665105] env[59518]: DEBUG oslo_vmware.api [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Waiting for the task: (returnval){ [ 1097.665105] env[59518]: value = "task-307998" [ 1097.665105] env[59518]: _type = "Task" [ 1097.665105] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1097.672753] env[59518]: DEBUG oslo_vmware.api [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Task: {'id': task-307998, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1097.745413] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1098.174250] env[59518]: DEBUG oslo_vmware.exceptions [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Fault InvalidArgument not matched. 
{{(pid=59518) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1098.174507] env[59518]: DEBUG oslo_concurrency.lockutils [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1098.175068] env[59518]: ERROR nova.compute.manager [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1098.175068] env[59518]: Faults: ['InvalidArgument'] [ 1098.175068] env[59518]: ERROR nova.compute.manager [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Traceback (most recent call last): [ 1098.175068] env[59518]: ERROR nova.compute.manager [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1098.175068] env[59518]: ERROR nova.compute.manager [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] yield resources [ 1098.175068] env[59518]: ERROR nova.compute.manager [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1098.175068] env[59518]: ERROR nova.compute.manager [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] self.driver.spawn(context, instance, image_meta, [ 1098.175068] env[59518]: ERROR nova.compute.manager [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1098.175068] env[59518]: ERROR nova.compute.manager [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1098.175068] env[59518]: ERROR nova.compute.manager [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1098.175068] env[59518]: ERROR nova.compute.manager [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] self._fetch_image_if_missing(context, vi) [ 1098.175068] env[59518]: ERROR nova.compute.manager [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1098.175397] env[59518]: ERROR nova.compute.manager [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] image_cache(vi, tmp_image_ds_loc) [ 1098.175397] env[59518]: ERROR nova.compute.manager [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1098.175397] env[59518]: ERROR nova.compute.manager [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] vm_util.copy_virtual_disk( [ 1098.175397] env[59518]: ERROR nova.compute.manager [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1098.175397] env[59518]: ERROR nova.compute.manager [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] session._wait_for_task(vmdk_copy_task) [ 1098.175397] env[59518]: ERROR nova.compute.manager [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1098.175397] env[59518]: ERROR nova.compute.manager [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] return self.wait_for_task(task_ref) [ 1098.175397] env[59518]: ERROR nova.compute.manager [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1098.175397] env[59518]: ERROR nova.compute.manager [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] return evt.wait() [ 1098.175397] env[59518]: ERROR nova.compute.manager [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1098.175397] env[59518]: ERROR nova.compute.manager [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] result = hub.switch() [ 1098.175397] env[59518]: ERROR nova.compute.manager [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1098.175397] env[59518]: ERROR nova.compute.manager [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] return self.greenlet.switch() [ 1098.175728] env[59518]: ERROR nova.compute.manager [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1098.175728] env[59518]: ERROR nova.compute.manager [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] self.f(*self.args, **self.kw) [ 1098.175728] env[59518]: ERROR nova.compute.manager [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1098.175728] env[59518]: ERROR nova.compute.manager [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] raise exceptions.translate_fault(task_info.error) [ 1098.175728] env[59518]: ERROR nova.compute.manager [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1098.175728] env[59518]: ERROR nova.compute.manager [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Faults: ['InvalidArgument'] [ 1098.175728] env[59518]: ERROR nova.compute.manager [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] [ 1098.175728] env[59518]: INFO nova.compute.manager [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Terminating instance [ 1098.176992] env[59518]: DEBUG oslo_concurrency.lockutils [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1098.177188] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1098.177420] env[59518]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d8b436fc-1085-48e3-800a-92a66a49bc3a {{(pid=59518) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1098.179643] env[59518]: DEBUG nova.compute.manager [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Start destroying the instance on the hypervisor. {{(pid=59518) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1098.179829] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Destroying instance {{(pid=59518) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1098.180538] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f532d439-5aab-4fb4-9963-eb4b8a041b4b {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1098.186680] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Unregistering the VM {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1098.186877] env[59518]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-073588c1-bbbc-4b3d-9d04-30ab6e929508 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1098.188870] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1098.189040] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=59518) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1098.189922] env[59518]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-df306b16-c303-4ca7-9b66-3dd03a9239c9 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1098.194506] env[59518]: DEBUG oslo_vmware.api [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Waiting for the task: (returnval){ [ 1098.194506] env[59518]: value = "session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]52977b22-adf7-4dd0-3a02-3f36b90b2b9a" [ 1098.194506] env[59518]: _type = "Task" [ 1098.194506] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1098.205215] env[59518]: DEBUG oslo_vmware.api [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Task: {'id': session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]52977b22-adf7-4dd0-3a02-3f36b90b2b9a, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1098.251515] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Unregistered the VM {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1098.251735] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Deleting contents of the VM from datastore datastore1 {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1098.251882] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Deleting the datastore file [datastore1] 04a58b0b-dfd8-4227-9c10-a69225fa5a53 {{(pid=59518) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1098.252187] env[59518]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-30ca608b-2174-495e-9c1d-315721c2458d {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1098.258037] env[59518]: DEBUG oslo_vmware.api [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Waiting for the task: (returnval){ [ 1098.258037] env[59518]: value = "task-308000" [ 1098.258037] env[59518]: _type = "Task" [ 1098.258037] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1098.265523] env[59518]: DEBUG oslo_vmware.api [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Task: {'id': task-308000, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1098.704921] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] [instance: 282b61db-76cd-44c3-b500-7a465e903c97] Preparing fetch location {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1098.705279] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Creating directory with path [datastore1] vmware_temp/f74312a4-bd7c-46e7-b8bf-8fc7f633482c/e70539a9-144d-4900-807e-914ae0cc8539 {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1098.705413] env[59518]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6ae8d55e-1d0a-4567-9bd1-b8b93cf471fe {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1098.715715] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Created directory with path [datastore1] vmware_temp/f74312a4-bd7c-46e7-b8bf-8fc7f633482c/e70539a9-144d-4900-807e-914ae0cc8539 {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1098.715917] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] [instance: 282b61db-76cd-44c3-b500-7a465e903c97] Fetch image to [datastore1] vmware_temp/f74312a4-bd7c-46e7-b8bf-8fc7f633482c/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1098.716099] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] [instance: 282b61db-76cd-44c3-b500-7a465e903c97] Downloading image file data e70539a9-144d-4900-807e-914ae0cc8539 to [datastore1] vmware_temp/f74312a4-bd7c-46e7-b8bf-8fc7f633482c/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk on the data store datastore1 {{(pid=59518) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1098.716755] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b7221719-25df-4db3-b65b-16dd46891335 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1098.722950] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9dfe5483-c9e6-43ef-b8ee-8d0cd1b44fc4 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1098.731702] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c8462c2a-10bc-4d14-8e1d-a771fe883ab7 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1098.764649] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-456c8fe0-3dd4-4a69-bb37-dab35214c6f2 {{(pid=59518) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1098.770986] env[59518]: DEBUG oslo_vmware.api [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Task: {'id': task-308000, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.077767} completed successfully. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1098.772337] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Deleted the datastore file {{(pid=59518) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1098.772515] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Deleted contents of the VM from datastore datastore1 {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1098.772676] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Instance destroyed {{(pid=59518) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1098.772838] env[59518]: INFO nova.compute.manager [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Took 0.59 seconds to destroy the instance on the hypervisor. 
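The task records in this stretch ("Waiting for the task", "progress is 0%.", "completed successfully", "Deleted the datastore file") all follow oslo.vmware's invoke-then-poll pattern. A minimal sketch under that assumption (the helper name and arguments are illustrative, not Nova's ds_util):

    from oslo_vmware import exceptions as vexc

    def delete_datastore_file(session, ds_path, dc_ref):
        # Start DeleteDatastoreFile_Task against the vCenter FileManager...
        file_manager = session.vim.service_content.fileManager
        task = session.invoke_api(session.vim, 'DeleteDatastoreFile_Task',
                                  file_manager, name=ds_path, datacenter=dc_ref)
        try:
            # ...then poll it: wait_for_task produces the "progress is N%"
            # lines and raises a translated fault such as VimFaultException
            # if the task ends in error.
            session.wait_for_task(task)
        except vexc.FileNotFoundException:
            pass  # file already gone; treat the delete as a success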
[ 1098.774782] env[59518]: DEBUG nova.compute.claims [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Aborting claim: {{(pid=59518) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1098.774935] env[59518]: DEBUG oslo_concurrency.lockutils [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1098.775132] env[59518]: DEBUG oslo_concurrency.lockutils [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1098.777445] env[59518]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-8ea74ae9-fb84-42a9-92ba-41a9c96b1fb5 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1098.796866] env[59518]: DEBUG nova.virt.vmwareapi.images [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] [instance: 282b61db-76cd-44c3-b500-7a465e903c97] Downloading image file data e70539a9-144d-4900-807e-914ae0cc8539 to the data store datastore1 {{(pid=59518) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1098.841512] env[59518]: DEBUG oslo_vmware.rw_handles [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f74312a4-bd7c-46e7-b8bf-8fc7f633482c/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59518) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1098.895469] env[59518]: DEBUG oslo_vmware.rw_handles [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Completed reading data from the image iterator. {{(pid=59518) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1098.895642] env[59518]: DEBUG oslo_vmware.rw_handles [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f74312a4-bd7c-46e7-b8bf-8fc7f633482c/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=59518) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}}
[ 1098.943156] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5ce61ea0-caec-4ac1-a91a-5deafa6ee5e8 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1098.950777] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ac30a7a-47c4-47d5-ac45-108aa6d1cf5c {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1098.979263] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e7d37aa-fb51-4f2d-9fe5-235c2efe78f3 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1098.986047] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f0d04d24-b5f3-4265-bee1-a8d2099d014f {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1098.998633] env[59518]: DEBUG nova.compute.provider_tree [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Inventory has not changed in ProviderTree for provider: ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd {{(pid=59518) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1099.007058] env[59518]: DEBUG nova.scheduler.client.report [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Inventory has not changed for provider ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 174, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59518) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1099.020690] env[59518]: DEBUG oslo_concurrency.lockutils [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.245s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1099.021269] env[59518]: ERROR nova.compute.manager [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1099.021269] env[59518]: Faults: ['InvalidArgument']
[ 1099.021269] env[59518]: ERROR nova.compute.manager [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Traceback (most recent call last):
[ 1099.021269] env[59518]: ERROR nova.compute.manager [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53]   File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 1099.021269] env[59518]: ERROR nova.compute.manager [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53]     self.driver.spawn(context, instance, image_meta,
[ 1099.021269] env[59518]: ERROR nova.compute.manager [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 1099.021269] env[59518]: ERROR nova.compute.manager [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1099.021269] env[59518]: ERROR nova.compute.manager [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1099.021269] env[59518]: ERROR nova.compute.manager [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53]     self._fetch_image_if_missing(context, vi)
[ 1099.021269] env[59518]: ERROR nova.compute.manager [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1099.021269] env[59518]: ERROR nova.compute.manager [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53]     image_cache(vi, tmp_image_ds_loc)
[ 1099.021269] env[59518]: ERROR nova.compute.manager [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1099.021587] env[59518]: ERROR nova.compute.manager [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53]     vm_util.copy_virtual_disk(
[ 1099.021587] env[59518]: ERROR nova.compute.manager [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1099.021587] env[59518]: ERROR nova.compute.manager [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53]     session._wait_for_task(vmdk_copy_task)
[ 1099.021587] env[59518]: ERROR nova.compute.manager [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1099.021587] env[59518]: ERROR nova.compute.manager [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53]     return self.wait_for_task(task_ref)
[ 1099.021587] env[59518]: ERROR nova.compute.manager [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1099.021587] env[59518]: ERROR nova.compute.manager [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53]     return evt.wait()
[ 1099.021587] env[59518]: ERROR nova.compute.manager [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53]   File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 1099.021587] env[59518]: ERROR nova.compute.manager [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53]     result = hub.switch()
[ 1099.021587] env[59518]: ERROR nova.compute.manager [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53]   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1099.021587] env[59518]: ERROR nova.compute.manager [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53]     return self.greenlet.switch()
[ 1099.021587] env[59518]: ERROR nova.compute.manager [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1099.021587] env[59518]: ERROR nova.compute.manager [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53]     self.f(*self.args, **self.kw)
[ 1099.021883] env[59518]: ERROR nova.compute.manager [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1099.021883] env[59518]: ERROR nova.compute.manager [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53]     raise exceptions.translate_fault(task_info.error)
[ 1099.021883] env[59518]: ERROR nova.compute.manager [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1099.021883] env[59518]: ERROR nova.compute.manager [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Faults: ['InvalidArgument']
[ 1099.021883] env[59518]: ERROR nova.compute.manager [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53]
[ 1099.022005] env[59518]: DEBUG nova.compute.utils [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] VimFaultException {{(pid=59518) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1099.023151] env[59518]: DEBUG nova.compute.manager [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Build of instance 04a58b0b-dfd8-4227-9c10-a69225fa5a53 was re-scheduled: A specified parameter was not correct: fileType
[ 1099.023151] env[59518]: Faults: ['InvalidArgument'] {{(pid=59518) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 1099.023509] env[59518]: DEBUG nova.compute.manager [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Unplugging VIFs for instance {{(pid=59518) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 1099.023672] env[59518]: DEBUG nova.compute.manager [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged.
{{(pid=59518) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1099.023833] env[59518]: DEBUG nova.compute.manager [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Deallocating network for instance {{(pid=59518) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1099.023986] env[59518]: DEBUG nova.network.neutron [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] deallocate_for_instance() {{(pid=59518) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1099.473025] env[59518]: DEBUG nova.network.neutron [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Updating instance_info_cache with network_info: [] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1099.488316] env[59518]: INFO nova.compute.manager [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Took 0.46 seconds to deallocate network for instance. [ 1099.571794] env[59518]: INFO nova.scheduler.client.report [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Deleted allocations for instance 04a58b0b-dfd8-4227-9c10-a69225fa5a53 [ 1099.588331] env[59518]: DEBUG oslo_concurrency.lockutils [None req-99168695-b946-4e0d-a21f-9ffd5eb83b5f tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Lock "04a58b0b-dfd8-4227-9c10-a69225fa5a53" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 481.590s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1099.589399] env[59518]: DEBUG oslo_concurrency.lockutils [None req-ace62319-0f11-46ef-a2c8-43222a4d7576 tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Lock "04a58b0b-dfd8-4227-9c10-a69225fa5a53" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 279.563s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1099.589928] env[59518]: DEBUG oslo_concurrency.lockutils [None req-ace62319-0f11-46ef-a2c8-43222a4d7576 tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Acquiring lock "04a58b0b-dfd8-4227-9c10-a69225fa5a53-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1099.589928] env[59518]: DEBUG oslo_concurrency.lockutils [None req-ace62319-0f11-46ef-a2c8-43222a4d7576 tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Lock "04a58b0b-dfd8-4227-9c10-a69225fa5a53-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1099.589928] env[59518]: DEBUG oslo_concurrency.lockutils [None req-ace62319-0f11-46ef-a2c8-43222a4d7576 tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Lock "04a58b0b-dfd8-4227-9c10-a69225fa5a53-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1099.591780] env[59518]: INFO nova.compute.manager [None req-ace62319-0f11-46ef-a2c8-43222a4d7576 tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Terminating instance [ 1099.593455] env[59518]: DEBUG nova.compute.manager [None req-ace62319-0f11-46ef-a2c8-43222a4d7576 tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Start destroying the instance on the hypervisor. {{(pid=59518) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1099.593639] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-ace62319-0f11-46ef-a2c8-43222a4d7576 tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Destroying instance {{(pid=59518) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1099.594067] env[59518]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-a3e5331f-bba3-49e3-8514-b24384ff9892 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1099.604568] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-acfc6841-97c8-4c7e-ad20-5ca49487c627 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1099.615316] env[59518]: DEBUG nova.compute.manager [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] Starting instance... {{(pid=59518) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1099.635971] env[59518]: WARNING nova.virt.vmwareapi.vmops [None req-ace62319-0f11-46ef-a2c8-43222a4d7576 tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 04a58b0b-dfd8-4227-9c10-a69225fa5a53 could not be found. 
[ 1099.636216] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-ace62319-0f11-46ef-a2c8-43222a4d7576 tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Instance destroyed {{(pid=59518) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1099.636405] env[59518]: INFO nova.compute.manager [None req-ace62319-0f11-46ef-a2c8-43222a4d7576 tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1099.636968] env[59518]: DEBUG oslo.service.loopingcall [None req-ace62319-0f11-46ef-a2c8-43222a4d7576 tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59518) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1099.637209] env[59518]: DEBUG nova.compute.manager [-] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Deallocating network for instance {{(pid=59518) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1099.637305] env[59518]: DEBUG nova.network.neutron [-] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] deallocate_for_instance() {{(pid=59518) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1099.658191] env[59518]: DEBUG nova.network.neutron [-] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Updating instance_info_cache with network_info: [] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1099.662381] env[59518]: DEBUG oslo_concurrency.lockutils [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1099.662594] env[59518]: DEBUG oslo_concurrency.lockutils [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1099.664299] env[59518]: INFO nova.compute.claims [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1099.667199] env[59518]: INFO nova.compute.manager [-] [instance: 04a58b0b-dfd8-4227-9c10-a69225fa5a53] Took 0.03 seconds to deallocate network for instance. 
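The oslo.service.loopingcall record above ("Waiting for function ... _deallocate_network_with_retries to return.") appears to come from the generic polling helper that nova wraps around retried cleanup work. A minimal, self-contained use of that primitive; the _poll body and its three-attempt cutoff are stand-ins:

    from oslo_service import loopingcall

    state = {'attempts': 0}

    def _poll():
        # Stand-in body: succeed on the third pass. Raising LoopingCallDone
        # stops the loop, and its retvalue becomes the result of wait().
        state['attempts'] += 1
        if state['attempts'] >= 3:
            raise loopingcall.LoopingCallDone(retvalue=True)

    timer = loopingcall.FixedIntervalLoopingCall(_poll)
    result = timer.start(interval=1.0).wait()  # True once _poll signals done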
[ 1099.745883] env[59518]: DEBUG oslo_concurrency.lockutils [None req-ace62319-0f11-46ef-a2c8-43222a4d7576 tempest-ListServersNegativeTestJSON-686373666 tempest-ListServersNegativeTestJSON-686373666-project-member] Lock "04a58b0b-dfd8-4227-9c10-a69225fa5a53" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.156s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1099.790843] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce570e8b-51ad-47c1-b482-76f22ec3bee1 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1099.798938] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-47685c78-346f-43fd-bdd9-df9dd6f4c4a5 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1099.829338] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-446fb1aa-40d8-4cb8-9b08-63a49a8961c1 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1099.836847] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-64301afe-a6ee-4da0-9baf-de8b80908b09 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1099.850665] env[59518]: DEBUG nova.compute.provider_tree [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] Inventory has not changed in ProviderTree for provider: ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd {{(pid=59518) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1099.860364] env[59518]: DEBUG nova.scheduler.client.report [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] Inventory has not changed for provider ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 174, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59518) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1099.873658] env[59518]: DEBUG oslo_concurrency.lockutils [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.211s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1099.874340] env[59518]: DEBUG nova.compute.manager [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] Start building networks asynchronously for instance. 
{{(pid=59518) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1099.916018] env[59518]: DEBUG nova.compute.utils [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] Using /dev/sd instead of None {{(pid=59518) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1099.917437] env[59518]: DEBUG nova.compute.manager [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] Allocating IP information in the background. {{(pid=59518) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1099.917606] env[59518]: DEBUG nova.network.neutron [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] allocate_for_instance() {{(pid=59518) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1099.926912] env[59518]: DEBUG nova.compute.manager [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] Start building block device mappings for instance. {{(pid=59518) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1099.964826] env[59518]: DEBUG nova.policy [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5377f4ef33d34772b9ad4f95b48e0f8e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c21421c641db4f98ac006a448a55852b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59518) authorize /opt/stack/nova/nova/policy.py:203}} [ 1099.992082] env[59518]: DEBUG nova.compute.manager [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] Start spawning the instance on the hypervisor. 
{{(pid=59518) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1100.013860] env[59518]: DEBUG nova.virt.hardware [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-05-28T13:15:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-05-28T13:15:21Z,direct_url=,disk_format='vmdk',id=e70539a9-144d-4900-807e-914ae0cc8539,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='33bab75ff2cf45ecb4ab54af3adf83ee',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-05-28T13:15:22Z,virtual_size=,visibility=), allow threads: False {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1100.014277] env[59518]: DEBUG nova.virt.hardware [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] Flavor limits 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1100.014437] env[59518]: DEBUG nova.virt.hardware [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] Image limits 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1100.014614] env[59518]: DEBUG nova.virt.hardware [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] Flavor pref 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1100.014755] env[59518]: DEBUG nova.virt.hardware [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] Image pref 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1100.015071] env[59518]: DEBUG nova.virt.hardware [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1100.015339] env[59518]: DEBUG nova.virt.hardware [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1100.015705] env[59518]: DEBUG nova.virt.hardware [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] Build topologies for 1 vcpu(s) 1:1:1 
{{(pid=59518) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1100.015930] env[59518]: DEBUG nova.virt.hardware [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] Got 1 possible topologies {{(pid=59518) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1100.016268] env[59518]: DEBUG nova.virt.hardware [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1100.016478] env[59518]: DEBUG nova.virt.hardware [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1100.017301] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6333857b-f306-4399-8d45-9ee8dfb24333 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1100.025088] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-73d86d8f-8935-43b2-b135-4be04e146a77 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1100.188023] env[59518]: DEBUG nova.network.neutron [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] Successfully created port: 1eeb9a90-e052-4a24-a64f-a310a82b6cde {{(pid=59518) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1100.448049] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1100.622172] env[59518]: DEBUG nova.network.neutron [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] Successfully updated port: 1eeb9a90-e052-4a24-a64f-a310a82b6cde {{(pid=59518) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1100.635281] env[59518]: DEBUG oslo_concurrency.lockutils [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] Acquiring lock "refresh_cache-468e2dc5-6a66-401d-b6cd-06bb94cea0ef" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1100.635409] env[59518]: DEBUG oslo_concurrency.lockutils [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] Acquired lock "refresh_cache-468e2dc5-6a66-401d-b6cd-06bb94cea0ef" {{(pid=59518) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1100.635551] env[59518]: DEBUG nova.network.neutron [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] Building network info cache for instance {{(pid=59518) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1100.681530] env[59518]: DEBUG nova.network.neutron [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] Instance cache missing network info. {{(pid=59518) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1100.843600] env[59518]: DEBUG nova.network.neutron [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] Updating instance_info_cache with network_info: [{"id": "1eeb9a90-e052-4a24-a64f-a310a82b6cde", "address": "fa:16:3e:ab:2b:76", "network": {"id": "5ef4e5d1-1f82-4939-b230-c79e282b1110", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-1409530798-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c21421c641db4f98ac006a448a55852b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f408ce42-3cac-4d9d-9c05-15471d653a18", "external-id": "nsx-vlan-transportzone-265", "segmentation_id": 265, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1eeb9a90-e0", "ovs_interfaceid": "1eeb9a90-e052-4a24-a64f-a310a82b6cde", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1100.856302] env[59518]: DEBUG oslo_concurrency.lockutils [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] Releasing lock "refresh_cache-468e2dc5-6a66-401d-b6cd-06bb94cea0ef" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1100.856591] env[59518]: DEBUG nova.compute.manager [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] Instance network_info: |[{"id": "1eeb9a90-e052-4a24-a64f-a310a82b6cde", "address": "fa:16:3e:ab:2b:76", "network": {"id": "5ef4e5d1-1f82-4939-b230-c79e282b1110", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-1409530798-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c21421c641db4f98ac006a448a55852b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f408ce42-3cac-4d9d-9c05-15471d653a18", "external-id": "nsx-vlan-transportzone-265", "segmentation_id": 265, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1eeb9a90-e0", "ovs_interfaceid": "1eeb9a90-e052-4a24-a64f-a310a82b6cde", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59518) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 1100.856940] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:ab:2b:76', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'f408ce42-3cac-4d9d-9c05-15471d653a18', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '1eeb9a90-e052-4a24-a64f-a310a82b6cde', 'vif_model': 'vmxnet3'}] {{(pid=59518) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1100.864636] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] Creating folder: Project (c21421c641db4f98ac006a448a55852b). Parent ref: group-v88807. {{(pid=59518) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1100.865149] env[59518]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-05509be1-039c-4b53-8dba-9f5b3e854e11 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1100.876596] env[59518]: INFO nova.virt.vmwareapi.vm_util [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] Created folder: Project (c21421c641db4f98ac006a448a55852b) in parent group-v88807. [ 1100.876741] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] Creating folder: Instances. Parent ref: group-v88863. {{(pid=59518) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1100.876950] env[59518]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f2a946af-99d3-483a-8834-89bdd2eb6bec {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1100.884820] env[59518]: INFO nova.virt.vmwareapi.vm_util [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] Created folder: Instances in parent group-v88863. 
[ 1100.885026] env[59518]: DEBUG oslo.service.loopingcall [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59518) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 1100.885194] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] Creating VM on the ESX host {{(pid=59518) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 1100.885373] env[59518]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-7fa37bf3-f808-4d34-9420-7d4eeafcd252 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1100.904715] env[59518]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 1100.904715] env[59518]:   value = "task-308003"
[ 1100.904715] env[59518]:   _type = "Task"
[ 1100.904715] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1100.911434] env[59518]: DEBUG oslo_vmware.api [-] Task: {'id': task-308003, 'name': CreateVM_Task} progress is 0%. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1101.414804] env[59518]: DEBUG oslo_vmware.api [-] Task: {'id': task-308003, 'name': CreateVM_Task, 'duration_secs': 0.274218} completed successfully. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 1101.415060] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] Created VM on the ESX host {{(pid=59518) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 1101.415610] env[59518]: DEBUG oslo_concurrency.lockutils [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 1101.415768] env[59518]: DEBUG oslo_concurrency.lockutils [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 1101.416113] env[59518]: DEBUG oslo_concurrency.lockutils [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}}
[ 1101.416348] env[59518]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-eff090c8-7292-46d4-b06f-23f073bfa753 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1101.420438] env[59518]: DEBUG oslo_vmware.api [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] Waiting for the task: (returnval){
[ 1101.420438] env[59518]:   value = "session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]52f4d465-ba00-c5a5-43a6-ec2149a88850"
[ 1101.420438] env[59518]:   _type = "Task"
[ 1101.420438] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1101.427194] env[59518]: DEBUG oslo_vmware.api [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] Task: {'id': session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]52f4d465-ba00-c5a5-43a6-ec2149a88850, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1101.504776] env[59518]: DEBUG nova.compute.manager [req-99da459d-2e5f-4fb6-a404-66b5b09f1ad6 req-cfd5f942-24fe-4649-8fc0-f71819d26e31 service nova] [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] Received event network-vif-plugged-1eeb9a90-e052-4a24-a64f-a310a82b6cde {{(pid=59518) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}}
[ 1101.504988] env[59518]: DEBUG oslo_concurrency.lockutils [req-99da459d-2e5f-4fb6-a404-66b5b09f1ad6 req-cfd5f942-24fe-4649-8fc0-f71819d26e31 service nova] Acquiring lock "468e2dc5-6a66-401d-b6cd-06bb94cea0ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1101.505189] env[59518]: DEBUG oslo_concurrency.lockutils [req-99da459d-2e5f-4fb6-a404-66b5b09f1ad6 req-cfd5f942-24fe-4649-8fc0-f71819d26e31 service nova] Lock "468e2dc5-6a66-401d-b6cd-06bb94cea0ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1101.505347] env[59518]: DEBUG oslo_concurrency.lockutils [req-99da459d-2e5f-4fb6-a404-66b5b09f1ad6 req-cfd5f942-24fe-4649-8fc0-f71819d26e31 service nova] Lock "468e2dc5-6a66-401d-b6cd-06bb94cea0ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1101.505523] env[59518]: DEBUG nova.compute.manager [req-99da459d-2e5f-4fb6-a404-66b5b09f1ad6 req-cfd5f942-24fe-4649-8fc0-f71819d26e31 service nova] [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] No waiting events found dispatching network-vif-plugged-1eeb9a90-e052-4a24-a64f-a310a82b6cde {{(pid=59518) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 1101.505699] env[59518]: WARNING nova.compute.manager [req-99da459d-2e5f-4fb6-a404-66b5b09f1ad6 req-cfd5f942-24fe-4649-8fc0-f71819d26e31 service nova] [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] Received unexpected event network-vif-plugged-1eeb9a90-e052-4a24-a64f-a310a82b6cde for instance with vm_state building and task_state spawning.
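The "Received event network-vif-plugged-..." records above are delivered by neutron through nova's os-server-external-events API once the port goes ACTIVE; the WARNING fires because no waiter was registered for the event yet. A hand-rolled equivalent of that notification, with a placeholder endpoint and token (the UUIDs are copied from the log for illustration only):

    import requests

    resp = requests.post(
        'http://nova-api.example.test/v2.1/os-server-external-events',
        headers={'X-Auth-Token': '<service-token>'},  # placeholder token
        json={'events': [{
            'server_uuid': '468e2dc5-6a66-401d-b6cd-06bb94cea0ef',
            'name': 'network-vif-plugged',
            'tag': '1eeb9a90-e052-4a24-a64f-a310a82b6cde',  # the port id
            'status': 'completed',
        }]})
    resp.raise_for_status()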
[ 1101.505850] env[59518]: DEBUG nova.compute.manager [req-99da459d-2e5f-4fb6-a404-66b5b09f1ad6 req-cfd5f942-24fe-4649-8fc0-f71819d26e31 service nova] [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] Received event network-changed-1eeb9a90-e052-4a24-a64f-a310a82b6cde {{(pid=59518) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 1101.505994] env[59518]: DEBUG nova.compute.manager [req-99da459d-2e5f-4fb6-a404-66b5b09f1ad6 req-cfd5f942-24fe-4649-8fc0-f71819d26e31 service nova] [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] Refreshing instance network info cache due to event network-changed-1eeb9a90-e052-4a24-a64f-a310a82b6cde. {{(pid=59518) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 1101.506165] env[59518]: DEBUG oslo_concurrency.lockutils [req-99da459d-2e5f-4fb6-a404-66b5b09f1ad6 req-cfd5f942-24fe-4649-8fc0-f71819d26e31 service nova] Acquiring lock "refresh_cache-468e2dc5-6a66-401d-b6cd-06bb94cea0ef" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1101.506284] env[59518]: DEBUG oslo_concurrency.lockutils [req-99da459d-2e5f-4fb6-a404-66b5b09f1ad6 req-cfd5f942-24fe-4649-8fc0-f71819d26e31 service nova] Acquired lock "refresh_cache-468e2dc5-6a66-401d-b6cd-06bb94cea0ef" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1101.506424] env[59518]: DEBUG nova.network.neutron [req-99da459d-2e5f-4fb6-a404-66b5b09f1ad6 req-cfd5f942-24fe-4649-8fc0-f71819d26e31 service nova] [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] Refreshing network info cache for port 1eeb9a90-e052-4a24-a64f-a310a82b6cde {{(pid=59518) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1101.692621] env[59518]: DEBUG nova.network.neutron [req-99da459d-2e5f-4fb6-a404-66b5b09f1ad6 req-cfd5f942-24fe-4649-8fc0-f71819d26e31 service nova] [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] Updated VIF entry in instance network info cache for port 1eeb9a90-e052-4a24-a64f-a310a82b6cde. 
{{(pid=59518) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1101.692955] env[59518]: DEBUG nova.network.neutron [req-99da459d-2e5f-4fb6-a404-66b5b09f1ad6 req-cfd5f942-24fe-4649-8fc0-f71819d26e31 service nova] [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] Updating instance_info_cache with network_info: [{"id": "1eeb9a90-e052-4a24-a64f-a310a82b6cde", "address": "fa:16:3e:ab:2b:76", "network": {"id": "5ef4e5d1-1f82-4939-b230-c79e282b1110", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-1409530798-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c21421c641db4f98ac006a448a55852b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f408ce42-3cac-4d9d-9c05-15471d653a18", "external-id": "nsx-vlan-transportzone-265", "segmentation_id": 265, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1eeb9a90-e0", "ovs_interfaceid": "1eeb9a90-e052-4a24-a64f-a310a82b6cde", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1101.701798] env[59518]: DEBUG oslo_concurrency.lockutils [req-99da459d-2e5f-4fb6-a404-66b5b09f1ad6 req-cfd5f942-24fe-4649-8fc0-f71819d26e31 service nova] Releasing lock "refresh_cache-468e2dc5-6a66-401d-b6cd-06bb94cea0ef" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1101.930307] env[59518]: DEBUG oslo_concurrency.lockutils [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1101.930596] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] Processing image e70539a9-144d-4900-807e-914ae0cc8539 {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1101.930713] env[59518]: DEBUG oslo_concurrency.lockutils [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1102.447983] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1102.448209] env[59518]: DEBUG 
nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Starting heal instance info cache {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9808}} [ 1102.448375] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Rebuilding the list of instances to heal {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}} [ 1102.464056] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: 282b61db-76cd-44c3-b500-7a465e903c97] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 1102.464209] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 1102.464333] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 1102.464456] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 1102.464574] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 1102.464690] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 1102.464804] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Didn't find any instances for network info cache update. 
{{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9894}} [ 1103.448071] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1103.448474] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1104.448168] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1104.448574] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1104.448574] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=59518) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10427}} [ 1105.447921] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1105.457645] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1105.458005] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1105.458005] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1105.458278] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59518) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1105.459370] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-89cd8a40-6bdf-4213-b920-c711ef6d9630 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1105.468651] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-18f4c377-2640-424b-a158-1e3778aa1b9b {{(pid=59518) 
request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1105.482434] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-52245ada-5ba2-4a3c-997a-44bdc2a440a8 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1105.488489] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-718c7953-f896-415b-96f5-37e02e886008 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1105.517521] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181748MB free_disk=174GB free_vcpus=48 pci_devices=None {{(pid=59518) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 1105.517665] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1105.517820] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1105.571925] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 282b61db-76cd-44c3-b500-7a465e903c97 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1105.572091] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 3e59b5d7-978d-405a-b68a-47ee03b9a713 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1105.572221] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance af4b8dd9-a05d-427e-a147-76c7cfec5862 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1105.572337] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance ae88d565-bbf5-4c29-aee9-364c23086de5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1105.572456] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 39cfe606-43a0-4a52-8ec1-433baf7a3aec actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1105.572569] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 468e2dc5-6a66-401d-b6cd-06bb94cea0ef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1105.572738] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Total usable vcpus: 48, total allocated vcpus: 6 {{(pid=59518) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 1105.572868] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1280MB phys_disk=200GB used_disk=6GB total_vcpus=48 used_vcpus=6 pci_stats=[] {{(pid=59518) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 1105.644702] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6881e60a-8f3f-4aac-b65c-090f6a5c464b {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1105.651917] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4df3ffed-fd67-4485-acb6-d75ca8e56891 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1105.680055] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-639112a3-fece-4fcc-8a5d-d3f3484c7c16 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1105.686634] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-207297ac-d70d-4895-9682-db73efa4a967 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1105.699136] env[59518]: DEBUG nova.compute.provider_tree [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Inventory has not changed in ProviderTree for provider: ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd {{(pid=59518) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1105.707343] env[59518]: DEBUG nova.scheduler.client.report [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Inventory has not changed for provider ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 174, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59518) set_inventory_for_provider 
/opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1105.719607] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59518) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 1105.719769] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.202s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1106.714915] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1146.468753] env[59518]: DEBUG nova.compute.manager [req-bdca0051-5329-46c9-bb47-8efa53043082 req-d0a3b1ef-399f-42c7-82ec-ab41e105f3b1 service nova] [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] Received event network-vif-deleted-1ad5bfc1-f374-42b5-8ca7-d91415bebbdd {{(pid=59518) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 1146.884753] env[59518]: WARNING oslo_vmware.rw_handles [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1146.884753] env[59518]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1146.884753] env[59518]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1146.884753] env[59518]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1146.884753] env[59518]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1146.884753] env[59518]: ERROR oslo_vmware.rw_handles response.begin() [ 1146.884753] env[59518]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1146.884753] env[59518]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1146.884753] env[59518]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1146.884753] env[59518]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1146.884753] env[59518]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1146.884753] env[59518]: ERROR oslo_vmware.rw_handles [ 1146.885076] env[59518]: DEBUG nova.virt.vmwareapi.images [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] [instance: 282b61db-76cd-44c3-b500-7a465e903c97] Downloaded image file data e70539a9-144d-4900-807e-914ae0cc8539 to vmware_temp/f74312a4-bd7c-46e7-b8bf-8fc7f633482c/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk on the data store datastore1 {{(pid=59518) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1146.886937] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 
tempest-ServersAdmin275Test-1556032142-project-member] [instance: 282b61db-76cd-44c3-b500-7a465e903c97] Caching image {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1146.887282] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Copying Virtual Disk [datastore1] vmware_temp/f74312a4-bd7c-46e7-b8bf-8fc7f633482c/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk to [datastore1] vmware_temp/f74312a4-bd7c-46e7-b8bf-8fc7f633482c/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk {{(pid=59518) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1146.887663] env[59518]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-f8f9c9cd-d69e-491d-acdf-0f5f60b1cbe0 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1146.895865] env[59518]: DEBUG oslo_vmware.api [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Waiting for the task: (returnval){ [ 1146.895865] env[59518]: value = "task-308004" [ 1146.895865] env[59518]: _type = "Task" [ 1146.895865] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1146.903520] env[59518]: DEBUG oslo_vmware.api [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Task: {'id': task-308004, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1147.405695] env[59518]: DEBUG oslo_vmware.exceptions [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Fault InvalidArgument not matched. 
{{(pid=59518) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1147.405923] env[59518]: DEBUG oslo_concurrency.lockutils [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1147.406450] env[59518]: ERROR nova.compute.manager [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] [instance: 282b61db-76cd-44c3-b500-7a465e903c97] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1147.406450] env[59518]: Faults: ['InvalidArgument'] [ 1147.406450] env[59518]: ERROR nova.compute.manager [instance: 282b61db-76cd-44c3-b500-7a465e903c97] Traceback (most recent call last): [ 1147.406450] env[59518]: ERROR nova.compute.manager [instance: 282b61db-76cd-44c3-b500-7a465e903c97] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1147.406450] env[59518]: ERROR nova.compute.manager [instance: 282b61db-76cd-44c3-b500-7a465e903c97] yield resources [ 1147.406450] env[59518]: ERROR nova.compute.manager [instance: 282b61db-76cd-44c3-b500-7a465e903c97] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1147.406450] env[59518]: ERROR nova.compute.manager [instance: 282b61db-76cd-44c3-b500-7a465e903c97] self.driver.spawn(context, instance, image_meta, [ 1147.406450] env[59518]: ERROR nova.compute.manager [instance: 282b61db-76cd-44c3-b500-7a465e903c97] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1147.406450] env[59518]: ERROR nova.compute.manager [instance: 282b61db-76cd-44c3-b500-7a465e903c97] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1147.406450] env[59518]: ERROR nova.compute.manager [instance: 282b61db-76cd-44c3-b500-7a465e903c97] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1147.406450] env[59518]: ERROR nova.compute.manager [instance: 282b61db-76cd-44c3-b500-7a465e903c97] self._fetch_image_if_missing(context, vi) [ 1147.406450] env[59518]: ERROR nova.compute.manager [instance: 282b61db-76cd-44c3-b500-7a465e903c97] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1147.406872] env[59518]: ERROR nova.compute.manager [instance: 282b61db-76cd-44c3-b500-7a465e903c97] image_cache(vi, tmp_image_ds_loc) [ 1147.406872] env[59518]: ERROR nova.compute.manager [instance: 282b61db-76cd-44c3-b500-7a465e903c97] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1147.406872] env[59518]: ERROR nova.compute.manager [instance: 282b61db-76cd-44c3-b500-7a465e903c97] vm_util.copy_virtual_disk( [ 1147.406872] env[59518]: ERROR nova.compute.manager [instance: 282b61db-76cd-44c3-b500-7a465e903c97] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1147.406872] env[59518]: ERROR nova.compute.manager [instance: 282b61db-76cd-44c3-b500-7a465e903c97] session._wait_for_task(vmdk_copy_task) [ 1147.406872] env[59518]: ERROR nova.compute.manager [instance: 282b61db-76cd-44c3-b500-7a465e903c97] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in 
_wait_for_task [ 1147.406872] env[59518]: ERROR nova.compute.manager [instance: 282b61db-76cd-44c3-b500-7a465e903c97] return self.wait_for_task(task_ref) [ 1147.406872] env[59518]: ERROR nova.compute.manager [instance: 282b61db-76cd-44c3-b500-7a465e903c97] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1147.406872] env[59518]: ERROR nova.compute.manager [instance: 282b61db-76cd-44c3-b500-7a465e903c97] return evt.wait() [ 1147.406872] env[59518]: ERROR nova.compute.manager [instance: 282b61db-76cd-44c3-b500-7a465e903c97] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1147.406872] env[59518]: ERROR nova.compute.manager [instance: 282b61db-76cd-44c3-b500-7a465e903c97] result = hub.switch() [ 1147.406872] env[59518]: ERROR nova.compute.manager [instance: 282b61db-76cd-44c3-b500-7a465e903c97] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1147.406872] env[59518]: ERROR nova.compute.manager [instance: 282b61db-76cd-44c3-b500-7a465e903c97] return self.greenlet.switch() [ 1147.407217] env[59518]: ERROR nova.compute.manager [instance: 282b61db-76cd-44c3-b500-7a465e903c97] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1147.407217] env[59518]: ERROR nova.compute.manager [instance: 282b61db-76cd-44c3-b500-7a465e903c97] self.f(*self.args, **self.kw) [ 1147.407217] env[59518]: ERROR nova.compute.manager [instance: 282b61db-76cd-44c3-b500-7a465e903c97] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1147.407217] env[59518]: ERROR nova.compute.manager [instance: 282b61db-76cd-44c3-b500-7a465e903c97] raise exceptions.translate_fault(task_info.error) [ 1147.407217] env[59518]: ERROR nova.compute.manager [instance: 282b61db-76cd-44c3-b500-7a465e903c97] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1147.407217] env[59518]: ERROR nova.compute.manager [instance: 282b61db-76cd-44c3-b500-7a465e903c97] Faults: ['InvalidArgument'] [ 1147.407217] env[59518]: ERROR nova.compute.manager [instance: 282b61db-76cd-44c3-b500-7a465e903c97] [ 1147.407217] env[59518]: INFO nova.compute.manager [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] [instance: 282b61db-76cd-44c3-b500-7a465e903c97] Terminating instance [ 1147.408317] env[59518]: DEBUG oslo_concurrency.lockutils [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1147.408516] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1147.409006] env[59518]: DEBUG oslo_concurrency.lockutils [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Acquiring lock "refresh_cache-282b61db-76cd-44c3-b500-7a465e903c97" 
{{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1147.409164] env[59518]: DEBUG oslo_concurrency.lockutils [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Acquired lock "refresh_cache-282b61db-76cd-44c3-b500-7a465e903c97" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1147.409322] env[59518]: DEBUG nova.network.neutron [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] [instance: 282b61db-76cd-44c3-b500-7a465e903c97] Building network info cache for instance {{(pid=59518) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1147.410205] env[59518]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-33a86aa0-fa6a-4be3-983f-aca76efc47d8 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1147.420475] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1147.420648] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=59518) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1147.421640] env[59518]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c16b31c4-eccc-4b35-b5ce-285edf75d01f {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1147.427544] env[59518]: DEBUG oslo_vmware.api [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Waiting for the task: (returnval){ [ 1147.427544] env[59518]: value = "session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]52f0b8f6-3a9d-e152-391e-42e6d0675c3a" [ 1147.427544] env[59518]: _type = "Task" [ 1147.427544] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1147.434242] env[59518]: DEBUG oslo_vmware.api [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Task: {'id': session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]52f0b8f6-3a9d-e152-391e-42e6d0675c3a, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1147.436180] env[59518]: DEBUG nova.network.neutron [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] [instance: 282b61db-76cd-44c3-b500-7a465e903c97] Instance cache missing network info. 
{{(pid=59518) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1147.483248] env[59518]: DEBUG nova.network.neutron [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] [instance: 282b61db-76cd-44c3-b500-7a465e903c97] Updating instance_info_cache with network_info: [] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1147.493696] env[59518]: DEBUG oslo_concurrency.lockutils [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Releasing lock "refresh_cache-282b61db-76cd-44c3-b500-7a465e903c97" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1147.494096] env[59518]: DEBUG nova.compute.manager [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] [instance: 282b61db-76cd-44c3-b500-7a465e903c97] Start destroying the instance on the hypervisor. {{(pid=59518) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1147.494283] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] [instance: 282b61db-76cd-44c3-b500-7a465e903c97] Destroying instance {{(pid=59518) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1147.495292] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ca0d689-6e27-48c5-b639-8afdaf273255 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1147.502774] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] [instance: 282b61db-76cd-44c3-b500-7a465e903c97] Unregistering the VM {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1147.502983] env[59518]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-d4c51e4a-6b78-4367-a15a-78987e186827 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1147.538404] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] [instance: 282b61db-76cd-44c3-b500-7a465e903c97] Unregistered the VM {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1147.538605] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] [instance: 282b61db-76cd-44c3-b500-7a465e903c97] Deleting contents of the VM from datastore datastore1 {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1147.538779] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Deleting the datastore file [datastore1] 282b61db-76cd-44c3-b500-7a465e903c97 {{(pid=59518) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1147.539030] env[59518]: 
DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-39e19c46-9188-433a-95b7-ef33f3468614 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1147.545856] env[59518]: DEBUG oslo_vmware.api [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Waiting for the task: (returnval){ [ 1147.545856] env[59518]: value = "task-308006" [ 1147.545856] env[59518]: _type = "Task" [ 1147.545856] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1147.553929] env[59518]: DEBUG oslo_vmware.api [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Task: {'id': task-308006, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1147.937974] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Preparing fetch location {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1147.938263] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Creating directory with path [datastore1] vmware_temp/bf310bd9-c101-4eba-85f3-13097e16ac92/e70539a9-144d-4900-807e-914ae0cc8539 {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1147.938517] env[59518]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c8c1db70-5bc1-43d1-b01b-e7228be88774 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1147.949260] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Created directory with path [datastore1] vmware_temp/bf310bd9-c101-4eba-85f3-13097e16ac92/e70539a9-144d-4900-807e-914ae0cc8539 {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1147.949437] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Fetch image to [datastore1] vmware_temp/bf310bd9-c101-4eba-85f3-13097e16ac92/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1147.949641] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Downloading image file data e70539a9-144d-4900-807e-914ae0cc8539 to [datastore1] vmware_temp/bf310bd9-c101-4eba-85f3-13097e16ac92/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk on the data store datastore1 {{(pid=59518) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1147.950406] env[59518]: 
DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ff0c4c0b-bd7b-4506-9aba-f61307d29a2a {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1147.957535] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c7fd00a3-05c6-4a8b-bebc-bdf3a203d990 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1147.968327] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-27e6449c-558a-49f2-a7c3-db100272bb70 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1148.000662] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-788d5ff3-6060-4608-9d66-d0458c90a2d7 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1148.006938] env[59518]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-a15e08ce-00bd-40eb-b7ca-9a9ca4c7e745 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1148.028524] env[59518]: DEBUG nova.virt.vmwareapi.images [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Downloading image file data e70539a9-144d-4900-807e-914ae0cc8539 to the data store datastore1 {{(pid=59518) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1148.059938] env[59518]: DEBUG oslo_vmware.api [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Task: {'id': task-308006, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.030409} completed successfully. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1148.060220] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Deleted the datastore file {{(pid=59518) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1148.060429] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] [instance: 282b61db-76cd-44c3-b500-7a465e903c97] Deleted contents of the VM from datastore datastore1 {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1148.060631] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] [instance: 282b61db-76cd-44c3-b500-7a465e903c97] Instance destroyed {{(pid=59518) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1148.060812] env[59518]: INFO nova.compute.manager [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] [instance: 282b61db-76cd-44c3-b500-7a465e903c97] Took 0.57 seconds to destroy the instance on the hypervisor. 
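The span above shows the full lifecycle of a vCenter-side task as oslo.vmware logs it: "Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-..." when the SOAP call is issued, "Task: {...} progress is N%" while wait_for_task() polls, and then either a completed record (task-308006, DeleteDatastoreFile_Task) or a translated fault (task-308004, where CopyVirtualDisk_Task failed with InvalidArgument on fileType). A minimal sketch of that call pattern, assuming a reachable vCenter; the endpoint, credentials, and datastore paths below are placeholders, not values from this log:

    # Sketch (not Nova's code): starting and polling a vCenter task with
    # oslo.vmware. Endpoint, credentials and paths are hypothetical.
    from oslo_vmware import api
    from oslo_vmware import exceptions as vexc

    session = api.VMwareAPISession('vc.example.test', 'admin', 'secret',
                                   api_retry_count=10, task_poll_interval=0.5)
    vdm = session.vim.service_content.virtualDiskManager
    try:
        # Issues the SOAP request (the "Invoking
        # VirtualDiskManager.CopyVirtualDisk_Task with opID=..." lines) and
        # returns a task moref. Against vCenter, sourceDatacenter and
        # destDatacenter morefs are normally passed as well.
        task = session.invoke_api(
            session.vim, 'CopyVirtualDisk_Task', vdm,
            sourceName='[datastore1] vmware_temp/demo/tmp-sparse.vmdk',
            destName='[datastore1] vmware_temp/demo/demo.vmdk')
        # Polls every task_poll_interval seconds, emitting the
        # "Task: {'id': ..., 'name': ...} progress is N%" DEBUG lines.
        session.wait_for_task(task)
    except vexc.VimFaultException as exc:
        # The failed copy above lands here, with
        # exc.fault_list == ['InvalidArgument'].
        print('vCenter task failed:', exc.fault_list, exc)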
[ 1148.061200] env[59518]: DEBUG oslo.service.loopingcall [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=59518) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1148.061436] env[59518]: DEBUG nova.compute.manager [-] [instance: 282b61db-76cd-44c3-b500-7a465e903c97] Skipping network deallocation for instance since networking was not requested. {{(pid=59518) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 1148.065344] env[59518]: DEBUG nova.compute.claims [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] [instance: 282b61db-76cd-44c3-b500-7a465e903c97] Aborting claim: {{(pid=59518) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1148.065536] env[59518]: DEBUG oslo_concurrency.lockutils [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1148.065761] env[59518]: DEBUG oslo_concurrency.lockutils [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1148.078667] env[59518]: DEBUG oslo_vmware.rw_handles [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/bf310bd9-c101-4eba-85f3-13097e16ac92/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59518) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1148.135530] env[59518]: DEBUG oslo_vmware.rw_handles [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Completed reading data from the image iterator. {{(pid=59518) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1148.135882] env[59518]: DEBUG oslo_vmware.rw_handles [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/bf310bd9-c101-4eba-85f3-13097e16ac92/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1.
{{(pid=59518) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1148.200985] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e82cf2a2-549c-4330-8adf-f58b12596c9a {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1148.208275] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-965e1e60-5260-41fa-940f-bd6c4c842eda {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1148.238277] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d04c00e7-3133-478b-9465-a9988b239c7d {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1148.245496] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d6621702-c713-40bd-a6c4-1b5e93d91301 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1148.258127] env[59518]: DEBUG nova.compute.provider_tree [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Inventory has not changed in ProviderTree for provider: ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd {{(pid=59518) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1148.266735] env[59518]: DEBUG nova.scheduler.client.report [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Inventory has not changed for provider ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 174, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59518) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1148.279712] env[59518]: DEBUG oslo_concurrency.lockutils [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.214s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1148.280302] env[59518]: ERROR nova.compute.manager [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] [instance: 282b61db-76cd-44c3-b500-7a465e903c97] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1148.280302] env[59518]: Faults: ['InvalidArgument'] [ 1148.280302] env[59518]: ERROR nova.compute.manager [instance: 282b61db-76cd-44c3-b500-7a465e903c97] Traceback (most recent call last): [ 1148.280302] env[59518]: ERROR nova.compute.manager [instance: 282b61db-76cd-44c3-b500-7a465e903c97] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1148.280302] env[59518]: ERROR nova.compute.manager [instance: 282b61db-76cd-44c3-b500-7a465e903c97] 
self.driver.spawn(context, instance, image_meta, [ 1148.280302] env[59518]: ERROR nova.compute.manager [instance: 282b61db-76cd-44c3-b500-7a465e903c97] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1148.280302] env[59518]: ERROR nova.compute.manager [instance: 282b61db-76cd-44c3-b500-7a465e903c97] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1148.280302] env[59518]: ERROR nova.compute.manager [instance: 282b61db-76cd-44c3-b500-7a465e903c97] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1148.280302] env[59518]: ERROR nova.compute.manager [instance: 282b61db-76cd-44c3-b500-7a465e903c97] self._fetch_image_if_missing(context, vi) [ 1148.280302] env[59518]: ERROR nova.compute.manager [instance: 282b61db-76cd-44c3-b500-7a465e903c97] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1148.280302] env[59518]: ERROR nova.compute.manager [instance: 282b61db-76cd-44c3-b500-7a465e903c97] image_cache(vi, tmp_image_ds_loc) [ 1148.280302] env[59518]: ERROR nova.compute.manager [instance: 282b61db-76cd-44c3-b500-7a465e903c97] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1148.280641] env[59518]: ERROR nova.compute.manager [instance: 282b61db-76cd-44c3-b500-7a465e903c97] vm_util.copy_virtual_disk( [ 1148.280641] env[59518]: ERROR nova.compute.manager [instance: 282b61db-76cd-44c3-b500-7a465e903c97] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1148.280641] env[59518]: ERROR nova.compute.manager [instance: 282b61db-76cd-44c3-b500-7a465e903c97] session._wait_for_task(vmdk_copy_task) [ 1148.280641] env[59518]: ERROR nova.compute.manager [instance: 282b61db-76cd-44c3-b500-7a465e903c97] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1148.280641] env[59518]: ERROR nova.compute.manager [instance: 282b61db-76cd-44c3-b500-7a465e903c97] return self.wait_for_task(task_ref) [ 1148.280641] env[59518]: ERROR nova.compute.manager [instance: 282b61db-76cd-44c3-b500-7a465e903c97] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1148.280641] env[59518]: ERROR nova.compute.manager [instance: 282b61db-76cd-44c3-b500-7a465e903c97] return evt.wait() [ 1148.280641] env[59518]: ERROR nova.compute.manager [instance: 282b61db-76cd-44c3-b500-7a465e903c97] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1148.280641] env[59518]: ERROR nova.compute.manager [instance: 282b61db-76cd-44c3-b500-7a465e903c97] result = hub.switch() [ 1148.280641] env[59518]: ERROR nova.compute.manager [instance: 282b61db-76cd-44c3-b500-7a465e903c97] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1148.280641] env[59518]: ERROR nova.compute.manager [instance: 282b61db-76cd-44c3-b500-7a465e903c97] return self.greenlet.switch() [ 1148.280641] env[59518]: ERROR nova.compute.manager [instance: 282b61db-76cd-44c3-b500-7a465e903c97] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1148.280641] env[59518]: ERROR nova.compute.manager [instance: 282b61db-76cd-44c3-b500-7a465e903c97] self.f(*self.args, **self.kw) [ 1148.280932] env[59518]: ERROR nova.compute.manager [instance: 282b61db-76cd-44c3-b500-7a465e903c97] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1148.280932] env[59518]: ERROR 
nova.compute.manager [instance: 282b61db-76cd-44c3-b500-7a465e903c97] raise exceptions.translate_fault(task_info.error) [ 1148.280932] env[59518]: ERROR nova.compute.manager [instance: 282b61db-76cd-44c3-b500-7a465e903c97] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1148.280932] env[59518]: ERROR nova.compute.manager [instance: 282b61db-76cd-44c3-b500-7a465e903c97] Faults: ['InvalidArgument'] [ 1148.280932] env[59518]: ERROR nova.compute.manager [instance: 282b61db-76cd-44c3-b500-7a465e903c97] [ 1148.281051] env[59518]: DEBUG nova.compute.utils [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] [instance: 282b61db-76cd-44c3-b500-7a465e903c97] VimFaultException {{(pid=59518) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1148.282775] env[59518]: DEBUG nova.compute.manager [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] [instance: 282b61db-76cd-44c3-b500-7a465e903c97] Build of instance 282b61db-76cd-44c3-b500-7a465e903c97 was re-scheduled: A specified parameter was not correct: fileType [ 1148.282775] env[59518]: Faults: ['InvalidArgument'] {{(pid=59518) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 1148.283168] env[59518]: DEBUG nova.compute.manager [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] [instance: 282b61db-76cd-44c3-b500-7a465e903c97] Unplugging VIFs for instance {{(pid=59518) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1148.283390] env[59518]: DEBUG oslo_concurrency.lockutils [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Acquiring lock "refresh_cache-282b61db-76cd-44c3-b500-7a465e903c97" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1148.283527] env[59518]: DEBUG oslo_concurrency.lockutils [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Acquired lock "refresh_cache-282b61db-76cd-44c3-b500-7a465e903c97" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1148.283679] env[59518]: DEBUG nova.network.neutron [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] [instance: 282b61db-76cd-44c3-b500-7a465e903c97] Building network info cache for instance {{(pid=59518) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1148.377961] env[59518]: DEBUG nova.network.neutron [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] [instance: 282b61db-76cd-44c3-b500-7a465e903c97] Instance cache missing network info. 
{{(pid=59518) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1148.482417] env[59518]: DEBUG nova.network.neutron [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] [instance: 282b61db-76cd-44c3-b500-7a465e903c97] Updating instance_info_cache with network_info: [] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1148.492485] env[59518]: DEBUG oslo_concurrency.lockutils [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Releasing lock "refresh_cache-282b61db-76cd-44c3-b500-7a465e903c97" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1148.493142] env[59518]: DEBUG nova.compute.manager [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59518) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1148.493704] env[59518]: DEBUG nova.compute.manager [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] [instance: 282b61db-76cd-44c3-b500-7a465e903c97] Skipping network deallocation for instance since networking was not requested. {{(pid=59518) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 1148.581060] env[59518]: INFO nova.scheduler.client.report [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Deleted allocations for instance 282b61db-76cd-44c3-b500-7a465e903c97 [ 1148.596098] env[59518]: DEBUG oslo_concurrency.lockutils [None req-f7370d0e-ce7e-4489-92d5-ccfe0579d961 tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Lock "282b61db-76cd-44c3-b500-7a465e903c97" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 530.312s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1148.596327] env[59518]: DEBUG oslo_concurrency.lockutils [None req-5e7f2c47-9909-46bd-bf47-550c5524d4fe tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Lock "282b61db-76cd-44c3-b500-7a465e903c97" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 328.895s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1148.596542] env[59518]: DEBUG oslo_concurrency.lockutils [None req-5e7f2c47-9909-46bd-bf47-550c5524d4fe tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Acquiring lock "282b61db-76cd-44c3-b500-7a465e903c97-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1148.596735] env[59518]: DEBUG oslo_concurrency.lockutils [None req-5e7f2c47-9909-46bd-bf47-550c5524d4fe tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Lock "282b61db-76cd-44c3-b500-7a465e903c97-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1148.596886] env[59518]: DEBUG oslo_concurrency.lockutils [None req-5e7f2c47-9909-46bd-bf47-550c5524d4fe tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Lock "282b61db-76cd-44c3-b500-7a465e903c97-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1148.598619] env[59518]: INFO nova.compute.manager [None req-5e7f2c47-9909-46bd-bf47-550c5524d4fe tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] [instance: 282b61db-76cd-44c3-b500-7a465e903c97] Terminating instance [ 1148.601281] env[59518]: DEBUG oslo_concurrency.lockutils [None req-5e7f2c47-9909-46bd-bf47-550c5524d4fe tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Acquiring lock "refresh_cache-282b61db-76cd-44c3-b500-7a465e903c97" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1148.601281] env[59518]: DEBUG oslo_concurrency.lockutils [None req-5e7f2c47-9909-46bd-bf47-550c5524d4fe tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Acquired lock "refresh_cache-282b61db-76cd-44c3-b500-7a465e903c97" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1148.601281] env[59518]: DEBUG nova.network.neutron [None req-5e7f2c47-9909-46bd-bf47-550c5524d4fe tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] [instance: 282b61db-76cd-44c3-b500-7a465e903c97] Building network info cache for instance {{(pid=59518) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1148.633444] env[59518]: DEBUG nova.network.neutron [None req-5e7f2c47-9909-46bd-bf47-550c5524d4fe tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] [instance: 282b61db-76cd-44c3-b500-7a465e903c97] Instance cache missing network info. {{(pid=59518) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1148.682058] env[59518]: DEBUG nova.network.neutron [None req-5e7f2c47-9909-46bd-bf47-550c5524d4fe tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] [instance: 282b61db-76cd-44c3-b500-7a465e903c97] Updating instance_info_cache with network_info: [] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1148.690600] env[59518]: DEBUG oslo_concurrency.lockutils [None req-5e7f2c47-9909-46bd-bf47-550c5524d4fe tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Releasing lock "refresh_cache-282b61db-76cd-44c3-b500-7a465e903c97" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1148.691002] env[59518]: DEBUG nova.compute.manager [None req-5e7f2c47-9909-46bd-bf47-550c5524d4fe tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] [instance: 282b61db-76cd-44c3-b500-7a465e903c97] Start destroying the instance on the hypervisor.
{{(pid=59518) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1148.691236] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-5e7f2c47-9909-46bd-bf47-550c5524d4fe tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] [instance: 282b61db-76cd-44c3-b500-7a465e903c97] Destroying instance {{(pid=59518) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1148.691732] env[59518]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-d33f7926-304f-4d71-be36-6e8fc61d42ce {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1148.700837] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1aecbb85-d7a4-483a-8127-1c40c72a9241 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1148.729485] env[59518]: WARNING nova.virt.vmwareapi.vmops [None req-5e7f2c47-9909-46bd-bf47-550c5524d4fe tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] [instance: 282b61db-76cd-44c3-b500-7a465e903c97] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 282b61db-76cd-44c3-b500-7a465e903c97 could not be found. [ 1148.729630] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-5e7f2c47-9909-46bd-bf47-550c5524d4fe tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] [instance: 282b61db-76cd-44c3-b500-7a465e903c97] Instance destroyed {{(pid=59518) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1148.729801] env[59518]: INFO nova.compute.manager [None req-5e7f2c47-9909-46bd-bf47-550c5524d4fe tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] [instance: 282b61db-76cd-44c3-b500-7a465e903c97] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1148.730055] env[59518]: DEBUG oslo.service.loopingcall [None req-5e7f2c47-9909-46bd-bf47-550c5524d4fe tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=59518) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1148.730261] env[59518]: DEBUG nova.compute.manager [-] [instance: 282b61db-76cd-44c3-b500-7a465e903c97] Deallocating network for instance {{(pid=59518) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1148.730355] env[59518]: DEBUG nova.network.neutron [-] [instance: 282b61db-76cd-44c3-b500-7a465e903c97] deallocate_for_instance() {{(pid=59518) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1148.746519] env[59518]: DEBUG nova.network.neutron [-] [instance: 282b61db-76cd-44c3-b500-7a465e903c97] Instance cache missing network info. {{(pid=59518) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1148.752917] env[59518]: DEBUG nova.network.neutron [-] [instance: 282b61db-76cd-44c3-b500-7a465e903c97] Updating instance_info_cache with network_info: [] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1148.760478] env[59518]: INFO nova.compute.manager [-] [instance: 282b61db-76cd-44c3-b500-7a465e903c97] Took 0.03 seconds to deallocate network for instance.
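Two oslo.concurrency locking idioms are interleaved through this terminate sequence: the Acquiring/Acquired/Releasing lines for "refresh_cache-<uuid>" come from the lock() context manager (lockutils.py:312/315/333), while the 'Acquiring ... by "..."' / 'acquired ... :: waited' / '"released" ... :: held' triplets come from the synchronized() decorator's inner wrapper (lockutils.py:404/409/423). A minimal sketch of both forms, with illustrative names rather than Nova's:

    # Sketch (not Nova's code) of the two lockutils idioms visible above.
    from oslo_concurrency import lockutils

    # Decorator form: the wrapper logs 'Acquiring lock "compute_resources"
    # by "..."', then 'acquired ... :: waited X.XXXs', and on return
    # '"released" ... :: held Y.YYYs'.
    @lockutils.synchronized('compute_resources')
    def update_available_resource():
        pass  # body runs with the named in-process semaphore held

    # Context-manager form, as used for the "refresh_cache-<uuid>" and
    # "<uuid>-events" locks above.
    with lockutils.lock('refresh_cache-282b61db-76cd-44c3-b500-7a465e903c97'):
        pass

    update_available_resource()

By default these are in-process semaphores (external=False), so the waited/held timings in the log measure contention between green threads inside this single nova-compute worker only.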
[ 1148.840732] env[59518]: DEBUG oslo_concurrency.lockutils [None req-5e7f2c47-9909-46bd-bf47-550c5524d4fe tempest-ServersAdmin275Test-1556032142 tempest-ServersAdmin275Test-1556032142-project-member] Lock "282b61db-76cd-44c3-b500-7a465e903c97" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.244s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1157.449417] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1157.449417] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1158.454528] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1158.454875] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Cleaning up deleted instances with incomplete migration {{(pid=59518) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11133}} [ 1160.456761] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1162.448613] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1162.449065] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Starting heal instance info cache {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9808}} [ 1162.449065] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Rebuilding the list of instances to heal {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}} [ 1162.462847] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 1162.463026] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 1162.463123] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Skipping network cache update for instance because it is Building. 
{{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 1162.463249] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 1162.463368] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Didn't find any instances for network info cache update. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9894}} [ 1163.908031] env[59518]: DEBUG oslo_concurrency.lockutils [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] Acquiring lock "8b692644-9080-4ffd-89b3-c8cd64de0e4f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1163.908309] env[59518]: DEBUG oslo_concurrency.lockutils [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] Lock "8b692644-9080-4ffd-89b3-c8cd64de0e4f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1163.917481] env[59518]: DEBUG nova.compute.manager [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] [instance: 8b692644-9080-4ffd-89b3-c8cd64de0e4f] Starting instance... 
{{(pid=59518) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1164.021440] env[59518]: DEBUG oslo_concurrency.lockutils [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1164.021676] env[59518]: DEBUG oslo_concurrency.lockutils [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1164.023158] env[59518]: INFO nova.compute.claims [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] [instance: 8b692644-9080-4ffd-89b3-c8cd64de0e4f] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1164.080998] env[59518]: DEBUG nova.scheduler.client.report [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] Refreshing inventories for resource provider ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd {{(pid=59518) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 1164.095320] env[59518]: DEBUG nova.scheduler.client.report [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] Updating ProviderTree inventory for provider ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 174, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59518) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 1164.095533] env[59518]: DEBUG nova.compute.provider_tree [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] Updating inventory in ProviderTree for provider ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 174, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59518) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 1164.106032] env[59518]: DEBUG nova.scheduler.client.report [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] Refreshing aggregate associations for resource provider ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd, aggregates: None {{(pid=59518) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 1164.121877] env[59518]: DEBUG nova.scheduler.client.report [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad 
tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] Refreshing trait associations for resource provider ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd, traits: COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NODE {{(pid=59518) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 1164.181852] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf0ce132-4fbc-4910-9bc2-0631830b2cd8 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1164.189381] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-be01d5c4-0513-436c-8a41-a31cd6ec07ea {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1164.219867] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e0cdf6b3-1049-466c-8e2a-10ba883dcb52 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1164.226535] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a08564b6-010a-4212-8532-d9756aa62c0d {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1164.240420] env[59518]: DEBUG nova.compute.provider_tree [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] Inventory has not changed in ProviderTree for provider: ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd {{(pid=59518) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1164.248522] env[59518]: DEBUG nova.scheduler.client.report [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] Inventory has not changed for provider ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 174, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59518) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1164.261160] env[59518]: DEBUG oslo_concurrency.lockutils [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.239s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1164.261684] env[59518]: DEBUG nova.compute.manager [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] [instance: 8b692644-9080-4ffd-89b3-c8cd64de0e4f] Start building networks asynchronously for instance. 
{{(pid=59518) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1164.293624] env[59518]: DEBUG nova.compute.utils [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] Using /dev/sd instead of None {{(pid=59518) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1164.295551] env[59518]: DEBUG nova.compute.manager [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] [instance: 8b692644-9080-4ffd-89b3-c8cd64de0e4f] Allocating IP information in the background. {{(pid=59518) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1164.295721] env[59518]: DEBUG nova.network.neutron [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] [instance: 8b692644-9080-4ffd-89b3-c8cd64de0e4f] allocate_for_instance() {{(pid=59518) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1164.303661] env[59518]: DEBUG nova.compute.manager [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] [instance: 8b692644-9080-4ffd-89b3-c8cd64de0e4f] Start building block device mappings for instance. {{(pid=59518) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1164.340818] env[59518]: DEBUG nova.policy [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cfe666a823a847edb4c7a1f7eb9ec480', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0c538bbfbf074f0c81047b7f2c86c353', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59518) authorize /opt/stack/nova/nova/policy.py:203}} [ 1164.361419] env[59518]: DEBUG nova.compute.manager [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] [instance: 8b692644-9080-4ffd-89b3-c8cd64de0e4f] Start spawning the instance on the hypervisor. 
{{(pid=59518) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1164.382687] env[59518]: DEBUG nova.virt.hardware [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-05-28T13:15:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-05-28T13:15:21Z,direct_url=,disk_format='vmdk',id=e70539a9-144d-4900-807e-914ae0cc8539,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='33bab75ff2cf45ecb4ab54af3adf83ee',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-05-28T13:15:22Z,virtual_size=,visibility=), allow threads: False {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1164.382935] env[59518]: DEBUG nova.virt.hardware [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] Flavor limits 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1164.383081] env[59518]: DEBUG nova.virt.hardware [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] Image limits 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1164.383250] env[59518]: DEBUG nova.virt.hardware [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] Flavor pref 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1164.383383] env[59518]: DEBUG nova.virt.hardware [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] Image pref 0:0:0 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1164.383514] env[59518]: DEBUG nova.virt.hardware [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59518) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1164.383707] env[59518]: DEBUG nova.virt.hardware [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1164.383900] env[59518]: DEBUG nova.virt.hardware [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59518) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1164.384077] env[59518]: DEBUG nova.virt.hardware [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 
tempest-ServersTestJSON-439153303-project-member] Got 1 possible topologies {{(pid=59518) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1164.384241] env[59518]: DEBUG nova.virt.hardware [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1164.384397] env[59518]: DEBUG nova.virt.hardware [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59518) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1164.385222] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c24a1e43-2443-42b0-afa8-755d9e33eb28 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1164.392864] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8eed8dd4-a50d-44cf-814b-d71e4beeba13 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1164.567370] env[59518]: DEBUG nova.network.neutron [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] [instance: 8b692644-9080-4ffd-89b3-c8cd64de0e4f] Successfully created port: 31e590b2-078f-4f45-a603-0ba5a409b415 {{(pid=59518) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1164.988490] env[59518]: DEBUG nova.compute.manager [req-35413fc2-23b1-4ef0-8ff2-1038c8ee472a req-63c59651-5d53-4dd8-b864-7707ddb711dc service nova] [instance: 8b692644-9080-4ffd-89b3-c8cd64de0e4f] Received event network-vif-plugged-31e590b2-078f-4f45-a603-0ba5a409b415 {{(pid=59518) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 1164.988740] env[59518]: DEBUG oslo_concurrency.lockutils [req-35413fc2-23b1-4ef0-8ff2-1038c8ee472a req-63c59651-5d53-4dd8-b864-7707ddb711dc service nova] Acquiring lock "8b692644-9080-4ffd-89b3-c8cd64de0e4f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1164.988898] env[59518]: DEBUG oslo_concurrency.lockutils [req-35413fc2-23b1-4ef0-8ff2-1038c8ee472a req-63c59651-5d53-4dd8-b864-7707ddb711dc service nova] Lock "8b692644-9080-4ffd-89b3-c8cd64de0e4f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1164.989050] env[59518]: DEBUG oslo_concurrency.lockutils [req-35413fc2-23b1-4ef0-8ff2-1038c8ee472a req-63c59651-5d53-4dd8-b864-7707ddb711dc service nova] Lock "8b692644-9080-4ffd-89b3-c8cd64de0e4f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1164.989223] env[59518]: DEBUG nova.compute.manager [req-35413fc2-23b1-4ef0-8ff2-1038c8ee472a req-63c59651-5d53-4dd8-b864-7707ddb711dc service nova] [instance: 8b692644-9080-4ffd-89b3-c8cd64de0e4f] No waiting events found dispatching 
network-vif-plugged-31e590b2-078f-4f45-a603-0ba5a409b415 {{(pid=59518) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1164.989376] env[59518]: WARNING nova.compute.manager [req-35413fc2-23b1-4ef0-8ff2-1038c8ee472a req-63c59651-5d53-4dd8-b864-7707ddb711dc service nova] [instance: 8b692644-9080-4ffd-89b3-c8cd64de0e4f] Received unexpected event network-vif-plugged-31e590b2-078f-4f45-a603-0ba5a409b415 for instance with vm_state building and task_state spawning. [ 1165.047715] env[59518]: DEBUG nova.network.neutron [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] [instance: 8b692644-9080-4ffd-89b3-c8cd64de0e4f] Successfully updated port: 31e590b2-078f-4f45-a603-0ba5a409b415 {{(pid=59518) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1165.055472] env[59518]: DEBUG oslo_concurrency.lockutils [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] Acquiring lock "refresh_cache-8b692644-9080-4ffd-89b3-c8cd64de0e4f" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1165.055600] env[59518]: DEBUG oslo_concurrency.lockutils [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] Acquired lock "refresh_cache-8b692644-9080-4ffd-89b3-c8cd64de0e4f" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1165.055743] env[59518]: DEBUG nova.network.neutron [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] [instance: 8b692644-9080-4ffd-89b3-c8cd64de0e4f] Building network info cache for instance {{(pid=59518) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1165.091341] env[59518]: DEBUG nova.network.neutron [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] [instance: 8b692644-9080-4ffd-89b3-c8cd64de0e4f] Instance cache missing network info. 
{{(pid=59518) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1165.217374] env[59518]: DEBUG nova.network.neutron [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] [instance: 8b692644-9080-4ffd-89b3-c8cd64de0e4f] Updating instance_info_cache with network_info: [{"id": "31e590b2-078f-4f45-a603-0ba5a409b415", "address": "fa:16:3e:0a:51:50", "network": {"id": "db8162d7-6cd0-4097-8267-c71dc3c125b7", "bridge": "br-int", "label": "tempest-ServersTestJSON-567571352-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0c538bbfbf074f0c81047b7f2c86c353", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "86a35d07-53d3-46b3-92cb-ae34236c0f41", "external-id": "nsx-vlan-transportzone-811", "segmentation_id": 811, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap31e590b2-07", "ovs_interfaceid": "31e590b2-078f-4f45-a603-0ba5a409b415", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1165.230196] env[59518]: DEBUG oslo_concurrency.lockutils [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] Releasing lock "refresh_cache-8b692644-9080-4ffd-89b3-c8cd64de0e4f" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1165.230454] env[59518]: DEBUG nova.compute.manager [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] [instance: 8b692644-9080-4ffd-89b3-c8cd64de0e4f] Instance network_info: |[{"id": "31e590b2-078f-4f45-a603-0ba5a409b415", "address": "fa:16:3e:0a:51:50", "network": {"id": "db8162d7-6cd0-4097-8267-c71dc3c125b7", "bridge": "br-int", "label": "tempest-ServersTestJSON-567571352-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0c538bbfbf074f0c81047b7f2c86c353", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "86a35d07-53d3-46b3-92cb-ae34236c0f41", "external-id": "nsx-vlan-transportzone-811", "segmentation_id": 811, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap31e590b2-07", "ovs_interfaceid": "31e590b2-078f-4f45-a603-0ba5a409b415", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59518) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 1165.230800] env[59518]: DEBUG 
nova.virt.vmwareapi.vmops [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] [instance: 8b692644-9080-4ffd-89b3-c8cd64de0e4f] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:0a:51:50', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '86a35d07-53d3-46b3-92cb-ae34236c0f41', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '31e590b2-078f-4f45-a603-0ba5a409b415', 'vif_model': 'vmxnet3'}] {{(pid=59518) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1165.238114] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] Creating folder: Project (0c538bbfbf074f0c81047b7f2c86c353). Parent ref: group-v88807. {{(pid=59518) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1165.238551] env[59518]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-ffd8e766-44c9-4550-b29f-b2de5be85779 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1165.249277] env[59518]: INFO nova.virt.vmwareapi.vm_util [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] Created folder: Project (0c538bbfbf074f0c81047b7f2c86c353) in parent group-v88807. [ 1165.249444] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] Creating folder: Instances. Parent ref: group-v88866. {{(pid=59518) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1165.249642] env[59518]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-bf13801d-a984-4882-8856-939785ae8282 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1165.257482] env[59518]: INFO nova.virt.vmwareapi.vm_util [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] Created folder: Instances in parent group-v88866. [ 1165.257656] env[59518]: DEBUG oslo.service.loopingcall [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59518) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1165.257814] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8b692644-9080-4ffd-89b3-c8cd64de0e4f] Creating VM on the ESX host {{(pid=59518) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1165.257982] env[59518]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-76bd5732-846b-408e-8300-c6e53ab213f0 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1165.275506] env[59518]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1165.275506] env[59518]: value = "task-308009" [ 1165.275506] env[59518]: _type = "Task" [ 1165.275506] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1165.282563] env[59518]: DEBUG oslo_vmware.api [-] Task: {'id': task-308009, 'name': CreateVM_Task} progress is 0%. 
{{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1165.448181] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1165.465116] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1165.465319] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1165.785143] env[59518]: DEBUG oslo_vmware.api [-] Task: {'id': task-308009, 'name': CreateVM_Task, 'duration_secs': 0.276354} completed successfully. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1165.785325] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8b692644-9080-4ffd-89b3-c8cd64de0e4f] Created VM on the ESX host {{(pid=59518) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1165.785981] env[59518]: DEBUG oslo_concurrency.lockutils [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1165.786135] env[59518]: DEBUG oslo_concurrency.lockutils [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1165.786433] env[59518]: DEBUG oslo_concurrency.lockutils [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1165.786660] env[59518]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e7ed0e65-5459-45d5-b4a1-696c551716b6 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1165.790698] env[59518]: DEBUG oslo_vmware.api [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] Waiting for the task: (returnval){ [ 1165.790698] env[59518]: value = "session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]52d139ca-e77a-e3b9-11ed-03a71c94ecf4" [ 1165.790698] env[59518]: _type = "Task" [ 1165.790698] env[59518]: } to complete. 
{{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1165.797672] env[59518]: DEBUG oslo_vmware.api [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] Task: {'id': session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]52d139ca-e77a-e3b9-11ed-03a71c94ecf4, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1166.300697] env[59518]: DEBUG oslo_concurrency.lockutils [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1166.301070] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] [instance: 8b692644-9080-4ffd-89b3-c8cd64de0e4f] Processing image e70539a9-144d-4900-807e-914ae0cc8539 {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1166.301201] env[59518]: DEBUG oslo_concurrency.lockutils [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1166.447954] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1166.448244] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1166.448424] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=59518) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10427}} [ 1167.021260] env[59518]: DEBUG nova.compute.manager [req-0a04850d-184f-41e1-849f-8a3e39259de6 req-81fb9cf0-4ba3-42fd-8299-ad210a0ad4ce service nova] [instance: 8b692644-9080-4ffd-89b3-c8cd64de0e4f] Received event network-changed-31e590b2-078f-4f45-a603-0ba5a409b415 {{(pid=59518) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 1167.021631] env[59518]: DEBUG nova.compute.manager [req-0a04850d-184f-41e1-849f-8a3e39259de6 req-81fb9cf0-4ba3-42fd-8299-ad210a0ad4ce service nova] [instance: 8b692644-9080-4ffd-89b3-c8cd64de0e4f] Refreshing instance network info cache due to event network-changed-31e590b2-078f-4f45-a603-0ba5a409b415. 
{{(pid=59518) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 1167.021974] env[59518]: DEBUG oslo_concurrency.lockutils [req-0a04850d-184f-41e1-849f-8a3e39259de6 req-81fb9cf0-4ba3-42fd-8299-ad210a0ad4ce service nova] Acquiring lock "refresh_cache-8b692644-9080-4ffd-89b3-c8cd64de0e4f" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1167.022219] env[59518]: DEBUG oslo_concurrency.lockutils [req-0a04850d-184f-41e1-849f-8a3e39259de6 req-81fb9cf0-4ba3-42fd-8299-ad210a0ad4ce service nova] Acquired lock "refresh_cache-8b692644-9080-4ffd-89b3-c8cd64de0e4f" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1167.022477] env[59518]: DEBUG nova.network.neutron [req-0a04850d-184f-41e1-849f-8a3e39259de6 req-81fb9cf0-4ba3-42fd-8299-ad210a0ad4ce service nova] [instance: 8b692644-9080-4ffd-89b3-c8cd64de0e4f] Refreshing network info cache for port 31e590b2-078f-4f45-a603-0ba5a409b415 {{(pid=59518) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1167.244135] env[59518]: DEBUG nova.network.neutron [req-0a04850d-184f-41e1-849f-8a3e39259de6 req-81fb9cf0-4ba3-42fd-8299-ad210a0ad4ce service nova] [instance: 8b692644-9080-4ffd-89b3-c8cd64de0e4f] Updated VIF entry in instance network info cache for port 31e590b2-078f-4f45-a603-0ba5a409b415. {{(pid=59518) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1167.244709] env[59518]: DEBUG nova.network.neutron [req-0a04850d-184f-41e1-849f-8a3e39259de6 req-81fb9cf0-4ba3-42fd-8299-ad210a0ad4ce service nova] [instance: 8b692644-9080-4ffd-89b3-c8cd64de0e4f] Updating instance_info_cache with network_info: [{"id": "31e590b2-078f-4f45-a603-0ba5a409b415", "address": "fa:16:3e:0a:51:50", "network": {"id": "db8162d7-6cd0-4097-8267-c71dc3c125b7", "bridge": "br-int", "label": "tempest-ServersTestJSON-567571352-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0c538bbfbf074f0c81047b7f2c86c353", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "86a35d07-53d3-46b3-92cb-ae34236c0f41", "external-id": "nsx-vlan-transportzone-811", "segmentation_id": 811, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap31e590b2-07", "ovs_interfaceid": "31e590b2-078f-4f45-a603-0ba5a409b415", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1167.255295] env[59518]: DEBUG oslo_concurrency.lockutils [req-0a04850d-184f-41e1-849f-8a3e39259de6 req-81fb9cf0-4ba3-42fd-8299-ad210a0ad4ce service nova] Releasing lock "refresh_cache-8b692644-9080-4ffd-89b3-c8cd64de0e4f" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1167.447438] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59518) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1167.457156] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1167.457513] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1167.457795] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1167.458049] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59518) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1167.459911] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-22be8f1e-8c4b-441b-a918-324b899d4bb5 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1167.468271] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-137f7286-2c2b-4760-9fde-20131a4c2015 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1167.483132] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28cd65af-a2ab-47cc-b8ab-221aff431f2a {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1167.489295] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78afcf61-9006-4a78-b17c-911ba6c35c8c {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1167.518660] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181722MB free_disk=174GB free_vcpus=48 pci_devices=None {{(pid=59518) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 1167.518947] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1167.519240] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1167.565781] env[59518]: DEBUG 
nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 3e59b5d7-978d-405a-b68a-47ee03b9a713 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1167.565940] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance af4b8dd9-a05d-427e-a147-76c7cfec5862 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1167.566066] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance ae88d565-bbf5-4c29-aee9-364c23086de5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1167.566183] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 468e2dc5-6a66-401d-b6cd-06bb94cea0ef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1167.566300] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 8b692644-9080-4ffd-89b3-c8cd64de0e4f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1167.566466] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Total usable vcpus: 48, total allocated vcpus: 5 {{(pid=59518) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 1167.566596] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1152MB phys_disk=200GB used_disk=5GB total_vcpus=48 used_vcpus=5 pci_stats=[] {{(pid=59518) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 1167.627986] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32e39479-a931-4634-b026-2e4f4b387c65 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1167.635075] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f0cc226-95c6-40c1-9666-e1f086998ee0 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1167.662885] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-24a4c634-f1e7-4d16-8e60-cb48a7665860 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1167.669331] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb8ea200-59ba-46d4-b326-468d266dc0ad {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1167.682671] env[59518]: DEBUG nova.compute.provider_tree [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Inventory has not changed in ProviderTree for provider: ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd {{(pid=59518) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1167.690259] env[59518]: DEBUG nova.scheduler.client.report [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Inventory has not changed for provider ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 174, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59518) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1167.704190] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59518) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 1167.704353] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.185s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1167.704546] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] 
Running periodic task ComputeManager._run_pending_deletes {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1167.704676] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Cleaning up deleted instances {{(pid=59518) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11095}} [ 1167.723366] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] There are 5 instances to clean {{(pid=59518) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11104}} [ 1167.723449] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] Instance has had 0 of 5 cleanup attempts {{(pid=59518) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11108}} [ 1167.755848] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: 3981aa30-0515-4764-9aac-d0c99a48b064] Instance has had 0 of 5 cleanup attempts {{(pid=59518) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11108}} [ 1167.775250] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] Instance has had 0 of 5 cleanup attempts {{(pid=59518) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11108}} [ 1167.795745] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] Instance has had 0 of 5 cleanup attempts {{(pid=59518) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11108}} [ 1167.814524] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: ac3485ac-4817-4492-a196-331002b2cc46] Instance has had 0 of 5 cleanup attempts {{(pid=59518) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11108}} [ 1168.828565] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1195.171458] env[59518]: WARNING oslo_vmware.rw_handles [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1195.171458] env[59518]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1195.171458] env[59518]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1195.171458] env[59518]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1195.171458] env[59518]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1195.171458] env[59518]: ERROR oslo_vmware.rw_handles response.begin() [ 1195.171458] env[59518]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1195.171458] env[59518]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1195.171458] env[59518]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1195.171458] 
env[59518]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1195.171458] env[59518]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1195.171458] env[59518]: ERROR oslo_vmware.rw_handles [ 1195.172079] env[59518]: DEBUG nova.virt.vmwareapi.images [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Downloaded image file data e70539a9-144d-4900-807e-914ae0cc8539 to vmware_temp/bf310bd9-c101-4eba-85f3-13097e16ac92/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk on the data store datastore1 {{(pid=59518) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1195.173546] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Caching image {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1195.173785] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Copying Virtual Disk [datastore1] vmware_temp/bf310bd9-c101-4eba-85f3-13097e16ac92/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk to [datastore1] vmware_temp/bf310bd9-c101-4eba-85f3-13097e16ac92/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk {{(pid=59518) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1195.174093] env[59518]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-d6c20d71-b67a-4327-96bf-5dd3ae2c24d3 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1195.181742] env[59518]: DEBUG oslo_vmware.api [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Waiting for the task: (returnval){ [ 1195.181742] env[59518]: value = "task-308010" [ 1195.181742] env[59518]: _type = "Task" [ 1195.181742] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1195.189207] env[59518]: DEBUG oslo_vmware.api [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Task: {'id': task-308010, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1195.693604] env[59518]: DEBUG oslo_vmware.exceptions [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Fault InvalidArgument not matched. 
{{(pid=59518) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1195.693836] env[59518]: DEBUG oslo_concurrency.lockutils [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1195.694438] env[59518]: ERROR nova.compute.manager [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1195.694438] env[59518]: Faults: ['InvalidArgument'] [ 1195.694438] env[59518]: ERROR nova.compute.manager [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Traceback (most recent call last): [ 1195.694438] env[59518]: ERROR nova.compute.manager [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1195.694438] env[59518]: ERROR nova.compute.manager [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] yield resources [ 1195.694438] env[59518]: ERROR nova.compute.manager [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1195.694438] env[59518]: ERROR nova.compute.manager [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] self.driver.spawn(context, instance, image_meta, [ 1195.694438] env[59518]: ERROR nova.compute.manager [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1195.694438] env[59518]: ERROR nova.compute.manager [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1195.694438] env[59518]: ERROR nova.compute.manager [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1195.694438] env[59518]: ERROR nova.compute.manager [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] self._fetch_image_if_missing(context, vi) [ 1195.694438] env[59518]: ERROR nova.compute.manager [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1195.694784] env[59518]: ERROR nova.compute.manager [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] image_cache(vi, tmp_image_ds_loc) [ 1195.694784] env[59518]: ERROR nova.compute.manager [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1195.694784] env[59518]: ERROR nova.compute.manager [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] vm_util.copy_virtual_disk( [ 1195.694784] env[59518]: ERROR nova.compute.manager [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1195.694784] env[59518]: ERROR nova.compute.manager [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] session._wait_for_task(vmdk_copy_task) [ 1195.694784] env[59518]: ERROR nova.compute.manager [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", 
line 157, in _wait_for_task [ 1195.694784] env[59518]: ERROR nova.compute.manager [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] return self.wait_for_task(task_ref) [ 1195.694784] env[59518]: ERROR nova.compute.manager [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1195.694784] env[59518]: ERROR nova.compute.manager [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] return evt.wait() [ 1195.694784] env[59518]: ERROR nova.compute.manager [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1195.694784] env[59518]: ERROR nova.compute.manager [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] result = hub.switch() [ 1195.694784] env[59518]: ERROR nova.compute.manager [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1195.694784] env[59518]: ERROR nova.compute.manager [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] return self.greenlet.switch() [ 1195.695101] env[59518]: ERROR nova.compute.manager [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1195.695101] env[59518]: ERROR nova.compute.manager [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] self.f(*self.args, **self.kw) [ 1195.695101] env[59518]: ERROR nova.compute.manager [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1195.695101] env[59518]: ERROR nova.compute.manager [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] raise exceptions.translate_fault(task_info.error) [ 1195.695101] env[59518]: ERROR nova.compute.manager [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1195.695101] env[59518]: ERROR nova.compute.manager [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Faults: ['InvalidArgument'] [ 1195.695101] env[59518]: ERROR nova.compute.manager [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] [ 1195.695101] env[59518]: INFO nova.compute.manager [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Terminating instance [ 1195.696594] env[59518]: DEBUG oslo_concurrency.lockutils [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1195.696837] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1195.697094] env[59518]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b72871b6-93ed-4769-a240-59c6d7b364bb {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 
1195.699378] env[59518]: DEBUG nova.compute.manager [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Start destroying the instance on the hypervisor. {{(pid=59518) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1195.699592] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Destroying instance {{(pid=59518) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1195.700422] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc12cee1-5d50-4941-ab3e-31d7a9d2c980 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1195.707016] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Unregistering the VM {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1195.707210] env[59518]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-4b212f39-1dde-4515-a681-9a90e87dafc7 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1195.709303] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1195.709499] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=59518) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1195.710389] env[59518]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c30dd329-96e0-4abf-a486-3ffba5dc7818 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1195.714843] env[59518]: DEBUG oslo_vmware.api [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Waiting for the task: (returnval){ [ 1195.714843] env[59518]: value = "session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]5207b0ce-577f-bc76-4088-0f6f6ce624ed" [ 1195.714843] env[59518]: _type = "Task" [ 1195.714843] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1195.721700] env[59518]: DEBUG oslo_vmware.api [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Task: {'id': session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]5207b0ce-577f-bc76-4088-0f6f6ce624ed, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1195.769625] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Unregistered the VM {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1195.769844] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Deleting contents of the VM from datastore datastore1 {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1195.770017] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Deleting the datastore file [datastore1] 3e59b5d7-978d-405a-b68a-47ee03b9a713 {{(pid=59518) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1195.770254] env[59518]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-cdfdee14-5458-4702-bc69-ebc235da3c93 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1195.775591] env[59518]: DEBUG oslo_vmware.api [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Waiting for the task: (returnval){ [ 1195.775591] env[59518]: value = "task-308012" [ 1195.775591] env[59518]: _type = "Task" [ 1195.775591] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1195.782707] env[59518]: DEBUG oslo_vmware.api [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Task: {'id': task-308012, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1196.230392] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Preparing fetch location {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1196.230805] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Creating directory with path [datastore1] vmware_temp/7a78a517-dde5-491d-ba4d-a96b82654a19/e70539a9-144d-4900-807e-914ae0cc8539 {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1196.231013] env[59518]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-235b1f17-bd92-49c8-87fb-b0b68058048f {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1196.242585] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Created directory with path [datastore1] vmware_temp/7a78a517-dde5-491d-ba4d-a96b82654a19/e70539a9-144d-4900-807e-914ae0cc8539 {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1196.242853] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Fetch image to [datastore1] vmware_temp/7a78a517-dde5-491d-ba4d-a96b82654a19/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1196.243106] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Downloading image file data e70539a9-144d-4900-807e-914ae0cc8539 to [datastore1] vmware_temp/7a78a517-dde5-491d-ba4d-a96b82654a19/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk on the data store datastore1 {{(pid=59518) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1196.244219] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f92ae80-575b-46d5-959d-260216eba89f {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1196.252945] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-43c9b047-ce8d-47d6-ab2a-d65a266fef6f {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1196.265551] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e29556e6-9de1-4e30-8f6e-707a22ec2045 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1196.315705] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-08e5fa30-52a5-4581-9c91-48641707d752 {{(pid=59518) 
request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1196.324752] env[59518]: DEBUG oslo_vmware.api [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Task: {'id': task-308012, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.064625} completed successfully. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1196.326588] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Deleted the datastore file {{(pid=59518) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1196.326851] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Deleted contents of the VM from datastore datastore1 {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1196.327097] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Instance destroyed {{(pid=59518) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1196.327340] env[59518]: INFO nova.compute.manager [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Took 0.63 seconds to destroy the instance on the hypervisor. 
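Annotation: the task records above ("Waiting for the task: ... task-308012" through "Task: {'id': task-308012, 'name': DeleteDatastoreFile_Task, ...} completed successfully") are oslo.vmware's standard pattern: invoke_api() submits a vSphere *_Task method and returns a task moref, then wait_for_task() polls task_info in a looping call until the task reaches 'success' or its error is translated into an exception (the earlier "Fault InvalidArgument not matched" line is that translation falling back to the generic VimFaultException). A minimal sketch of the same pattern; the host, credentials, moref and file path are placeholders, not values from this log:

    # Sketch only: submit DeleteDatastoreFile_Task and block until it completes.
    from oslo_vmware import api, vim_util

    session = api.VMwareAPISession(
        'vcenter.example.org', 'svc-user', 'secret',   # placeholder endpoint/creds
        api_retry_count=3, task_poll_interval=0.5)

    content = session.vim.service_content
    dc_ref = vim_util.get_moref('datacenter-2', 'Datacenter')  # hypothetical moref
    task = session.invoke_api(
        session.vim, 'DeleteDatastoreFile_Task', content.fileManager,
        name='[datastore1] some-instance-dir', datacenter=dc_ref)
    # Polls task_info (the "_poll_task ... progress is 0%" lines above) and
    # raises a translated fault, e.g. VimFaultException, if the task errors out.
    session.wait_for_task(task)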
[ 1196.330038] env[59518]: DEBUG nova.compute.claims [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Aborting claim: {{(pid=59518) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1196.330267] env[59518]: DEBUG oslo_concurrency.lockutils [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1196.330559] env[59518]: DEBUG oslo_concurrency.lockutils [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1196.334075] env[59518]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-181a80b0-ceb4-4b92-9014-e81324d87bd9 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1196.355499] env[59518]: DEBUG nova.virt.vmwareapi.images [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Downloading image file data e70539a9-144d-4900-807e-914ae0cc8539 to the data store datastore1 {{(pid=59518) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1196.402682] env[59518]: DEBUG oslo_vmware.rw_handles [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/7a78a517-dde5-491d-ba4d-a96b82654a19/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59518) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1196.456802] env[59518]: DEBUG oslo_vmware.rw_handles [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Completed reading data from the image iterator. {{(pid=59518) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1196.456978] env[59518]: DEBUG oslo_vmware.rw_handles [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/7a78a517-dde5-491d-ba4d-a96b82654a19/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=59518) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1196.486012] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a6043bf1-cc85-458f-aa4e-cfa963d5f47f {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1196.493251] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ceb5bdc-9cae-47a9-89d2-31fcd59015cf {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1196.522605] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-43c6974b-f0bd-44a5-8111-31716b537bd9 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1196.529424] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d4c6b17a-67ae-4cf1-a1c3-c720883d50b2 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1196.542289] env[59518]: DEBUG nova.compute.provider_tree [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Inventory has not changed in ProviderTree for provider: ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd {{(pid=59518) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1196.550406] env[59518]: DEBUG nova.scheduler.client.report [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Inventory has not changed for provider ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 174, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59518) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1196.563246] env[59518]: DEBUG oslo_concurrency.lockutils [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.233s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1196.563746] env[59518]: ERROR nova.compute.manager [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1196.563746] env[59518]: Faults: ['InvalidArgument'] [ 1196.563746] env[59518]: ERROR nova.compute.manager [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Traceback (most recent call last): [ 1196.563746] env[59518]: ERROR nova.compute.manager [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1196.563746] env[59518]: ERROR nova.compute.manager [instance: 
3e59b5d7-978d-405a-b68a-47ee03b9a713] self.driver.spawn(context, instance, image_meta, [ 1196.563746] env[59518]: ERROR nova.compute.manager [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1196.563746] env[59518]: ERROR nova.compute.manager [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1196.563746] env[59518]: ERROR nova.compute.manager [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1196.563746] env[59518]: ERROR nova.compute.manager [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] self._fetch_image_if_missing(context, vi) [ 1196.563746] env[59518]: ERROR nova.compute.manager [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1196.563746] env[59518]: ERROR nova.compute.manager [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] image_cache(vi, tmp_image_ds_loc) [ 1196.563746] env[59518]: ERROR nova.compute.manager [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1196.564082] env[59518]: ERROR nova.compute.manager [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] vm_util.copy_virtual_disk( [ 1196.564082] env[59518]: ERROR nova.compute.manager [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1196.564082] env[59518]: ERROR nova.compute.manager [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] session._wait_for_task(vmdk_copy_task) [ 1196.564082] env[59518]: ERROR nova.compute.manager [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1196.564082] env[59518]: ERROR nova.compute.manager [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] return self.wait_for_task(task_ref) [ 1196.564082] env[59518]: ERROR nova.compute.manager [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1196.564082] env[59518]: ERROR nova.compute.manager [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] return evt.wait() [ 1196.564082] env[59518]: ERROR nova.compute.manager [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1196.564082] env[59518]: ERROR nova.compute.manager [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] result = hub.switch() [ 1196.564082] env[59518]: ERROR nova.compute.manager [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1196.564082] env[59518]: ERROR nova.compute.manager [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] return self.greenlet.switch() [ 1196.564082] env[59518]: ERROR nova.compute.manager [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1196.564082] env[59518]: ERROR nova.compute.manager [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] self.f(*self.args, **self.kw) [ 1196.564384] env[59518]: ERROR nova.compute.manager [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 
1196.564384] env[59518]: ERROR nova.compute.manager [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] raise exceptions.translate_fault(task_info.error) [ 1196.564384] env[59518]: ERROR nova.compute.manager [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1196.564384] env[59518]: ERROR nova.compute.manager [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Faults: ['InvalidArgument'] [ 1196.564384] env[59518]: ERROR nova.compute.manager [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] [ 1196.564511] env[59518]: DEBUG nova.compute.utils [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] VimFaultException {{(pid=59518) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1196.565784] env[59518]: DEBUG nova.compute.manager [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Build of instance 3e59b5d7-978d-405a-b68a-47ee03b9a713 was re-scheduled: A specified parameter was not correct: fileType [ 1196.565784] env[59518]: Faults: ['InvalidArgument'] {{(pid=59518) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 1196.566143] env[59518]: DEBUG nova.compute.manager [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Unplugging VIFs for instance {{(pid=59518) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1196.566307] env[59518]: DEBUG nova.compute.manager [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59518) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1196.566493] env[59518]: DEBUG nova.compute.manager [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Deallocating network for instance {{(pid=59518) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1196.566623] env[59518]: DEBUG nova.network.neutron [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] deallocate_for_instance() {{(pid=59518) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1196.827881] env[59518]: DEBUG nova.network.neutron [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Updating instance_info_cache with network_info: [] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1196.838903] env[59518]: INFO nova.compute.manager [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Took 0.27 seconds to deallocate network for instance. [ 1196.922457] env[59518]: INFO nova.scheduler.client.report [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Deleted allocations for instance 3e59b5d7-978d-405a-b68a-47ee03b9a713 [ 1196.938450] env[59518]: DEBUG oslo_concurrency.lockutils [None req-034633c6-f75b-4063-be6a-f0b0f54c6d34 tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Lock "3e59b5d7-978d-405a-b68a-47ee03b9a713" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 576.561s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1196.938706] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c70b28d4-2bd1-44f2-8366-412347f0cf9d tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Lock "3e59b5d7-978d-405a-b68a-47ee03b9a713" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 378.595s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1196.938941] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c70b28d4-2bd1-44f2-8366-412347f0cf9d tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Acquiring lock "3e59b5d7-978d-405a-b68a-47ee03b9a713-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1196.939147] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c70b28d4-2bd1-44f2-8366-412347f0cf9d tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Lock "3e59b5d7-978d-405a-b68a-47ee03b9a713-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" 
:: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1196.939667] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c70b28d4-2bd1-44f2-8366-412347f0cf9d tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Lock "3e59b5d7-978d-405a-b68a-47ee03b9a713-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1196.941381] env[59518]: INFO nova.compute.manager [None req-c70b28d4-2bd1-44f2-8366-412347f0cf9d tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Terminating instance [ 1196.942980] env[59518]: DEBUG nova.compute.manager [None req-c70b28d4-2bd1-44f2-8366-412347f0cf9d tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Start destroying the instance on the hypervisor. {{(pid=59518) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1196.943166] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-c70b28d4-2bd1-44f2-8366-412347f0cf9d tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Destroying instance {{(pid=59518) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1196.943621] env[59518]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-8141fd53-295a-45c6-bfa5-9d2851751e2f {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1196.952608] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16feacbb-ca73-46d7-b285-a6c0a867256d {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1196.979511] env[59518]: WARNING nova.virt.vmwareapi.vmops [None req-c70b28d4-2bd1-44f2-8366-412347f0cf9d tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 3e59b5d7-978d-405a-b68a-47ee03b9a713 could not be found. [ 1196.979511] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-c70b28d4-2bd1-44f2-8366-412347f0cf9d tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Instance destroyed {{(pid=59518) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1196.979656] env[59518]: INFO nova.compute.manager [None req-c70b28d4-2bd1-44f2-8366-412347f0cf9d tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1196.979911] env[59518]: DEBUG oslo.service.loopingcall [None req-c70b28d4-2bd1-44f2-8366-412347f0cf9d tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=59518) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1196.980133] env[59518]: DEBUG nova.compute.manager [-] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Deallocating network for instance {{(pid=59518) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1196.980232] env[59518]: DEBUG nova.network.neutron [-] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] deallocate_for_instance() {{(pid=59518) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1197.000721] env[59518]: DEBUG nova.network.neutron [-] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Updating instance_info_cache with network_info: [] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1197.008751] env[59518]: INFO nova.compute.manager [-] [instance: 3e59b5d7-978d-405a-b68a-47ee03b9a713] Took 0.03 seconds to deallocate network for instance. [ 1197.084442] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c70b28d4-2bd1-44f2-8366-412347f0cf9d tempest-AttachInterfacesTestJSON-1153910947 tempest-AttachInterfacesTestJSON-1153910947-project-member] Lock "3e59b5d7-978d-405a-b68a-47ee03b9a713" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.146s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1210.564687] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._sync_power_states {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1210.583653] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Getting list of instances from cluster (obj){ [ 1210.583653] env[59518]: value = "domain-c8" [ 1210.583653] env[59518]: _type = "ClusterComputeResource" [ 1210.583653] env[59518]: } {{(pid=59518) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 1210.584937] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-afb585fe-3f13-48b6-a5b7-394d0fbd5f4f {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1210.601698] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Got total of 9 instances {{(pid=59518) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 1210.601860] env[59518]: WARNING nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] While synchronizing instance power states, found 4 instances in the database and 9 instances on the hypervisor. 
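Annotation: the "Acquiring lock ... by ..." / "Lock ... acquired/released" pairs that dominate these periodic-task records come from oslo.concurrency. Nova guards each per-instance operation with an in-process semaphore named by the instance UUID, so the power-state sync being triggered here cannot interleave with a concurrent terminate of the same instance. A minimal sketch of that idiom, with hypothetical driver/db helpers standing in for nova's internals:

    # Sketch only: the named-lock idiom behind the lockutils debug lines.
    from oslo_concurrency import lockutils

    def query_driver_power_state_and_sync(instance_uuid, driver, db):
        # Emits "Acquiring lock <uuid> ..." / "released" lines; the in-process
        # lock is a semaphore keyed by name, so greenthreads working on the
        # same instance serialize here.
        with lockutils.lock(instance_uuid):
            power_state = driver.get_power_state(instance_uuid)  # hypothetical helper
            db.sync_power_state(instance_uuid, power_state)      # hypothetical helper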
[ 1210.601952] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Triggering sync for uuid af4b8dd9-a05d-427e-a147-76c7cfec5862 {{(pid=59518) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10218}} [ 1210.602149] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Triggering sync for uuid ae88d565-bbf5-4c29-aee9-364c23086de5 {{(pid=59518) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10218}} [ 1210.602394] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Triggering sync for uuid 468e2dc5-6a66-401d-b6cd-06bb94cea0ef {{(pid=59518) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10218}} [ 1210.602563] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Triggering sync for uuid 8b692644-9080-4ffd-89b3-c8cd64de0e4f {{(pid=59518) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10218}} [ 1210.602915] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Acquiring lock "af4b8dd9-a05d-427e-a147-76c7cfec5862" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1210.603193] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Acquiring lock "ae88d565-bbf5-4c29-aee9-364c23086de5" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1210.603411] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Acquiring lock "468e2dc5-6a66-401d-b6cd-06bb94cea0ef" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1210.603597] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Acquiring lock "8b692644-9080-4ffd-89b3-c8cd64de0e4f" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1217.486623] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1220.448161] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1224.448615] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1224.448903] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Starting heal instance info cache {{(pid=59518) 
_heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9808}} [ 1224.449124] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Rebuilding the list of instances to heal {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}} [ 1224.462869] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 1224.463032] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 1224.463143] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 1224.463268] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: 8b692644-9080-4ffd-89b3-c8cd64de0e4f] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 1224.463389] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Didn't find any instances for network info cache update. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9894}} [ 1225.447395] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1226.448748] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1226.449148] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1228.449707] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1228.449707] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=59518) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10427}} [ 1228.449707] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1228.458813] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1228.459019] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1228.459193] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1228.459357] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59518) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1228.460743] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-766ff43f-5a18-4f65-9234-d1898dba61d3 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1228.468226] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ed2b6445-4133-4ba7-b5df-cf794dcd911a {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1228.481717] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ba2157da-1d53-4d75-ba02-2336efcf8978 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1228.487516] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2a965c38-2fa9-4f7b-b6e0-d47a86dbadc5 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1228.516691] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181733MB free_disk=174GB free_vcpus=48 pci_devices=None {{(pid=59518) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 1228.516857] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1228.517014] env[59518]: DEBUG oslo_concurrency.lockutils [None 
req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1228.564282] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance af4b8dd9-a05d-427e-a147-76c7cfec5862 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1228.564430] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance ae88d565-bbf5-4c29-aee9-364c23086de5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1228.564553] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 468e2dc5-6a66-401d-b6cd-06bb94cea0ef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1228.564673] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 8b692644-9080-4ffd-89b3-c8cd64de0e4f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1228.564839] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Total usable vcpus: 48, total allocated vcpus: 4 {{(pid=59518) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 1228.564969] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1024MB phys_disk=200GB used_disk=4GB total_vcpus=48 used_vcpus=4 pci_stats=[] {{(pid=59518) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 1228.618679] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d397eea3-ea9e-45cf-9f82-bbfa14d41f3b {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1228.626143] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d37de3b1-d2bc-498a-8a75-e3c1cfb5b105 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1228.655509] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-99bab611-1958-48eb-8f1b-30bc5c654e02 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1228.662497] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-74061763-4519-4064-b8c1-f10b5b5d41e7 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1228.674990] env[59518]: DEBUG nova.compute.provider_tree [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Inventory has not changed in ProviderTree for provider: ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd {{(pid=59518) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1228.683208] env[59518]: DEBUG nova.scheduler.client.report [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Inventory has not changed for provider ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 174, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59518) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1228.699900] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59518) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 1228.700053] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.183s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1229.694757] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] 
Running periodic task ComputeManager._check_instance_build_time {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1242.278723] env[59518]: WARNING oslo_vmware.rw_handles [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1242.278723] env[59518]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1242.278723] env[59518]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1242.278723] env[59518]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1242.278723] env[59518]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1242.278723] env[59518]: ERROR oslo_vmware.rw_handles response.begin() [ 1242.278723] env[59518]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1242.278723] env[59518]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1242.278723] env[59518]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1242.278723] env[59518]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1242.278723] env[59518]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1242.278723] env[59518]: ERROR oslo_vmware.rw_handles [ 1242.279720] env[59518]: DEBUG nova.virt.vmwareapi.images [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Downloaded image file data e70539a9-144d-4900-807e-914ae0cc8539 to vmware_temp/7a78a517-dde5-491d-ba4d-a96b82654a19/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk on the data store datastore1 {{(pid=59518) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1242.281007] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Caching image {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1242.281295] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Copying Virtual Disk [datastore1] vmware_temp/7a78a517-dde5-491d-ba4d-a96b82654a19/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk to [datastore1] vmware_temp/7a78a517-dde5-491d-ba4d-a96b82654a19/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk {{(pid=59518) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1242.281906] env[59518]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-c664c6bb-4407-406f-923f-1523f9eb0dc9 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1242.289297] env[59518]: DEBUG oslo_vmware.api [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 
tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Waiting for the task: (returnval){ [ 1242.289297] env[59518]: value = "task-308013" [ 1242.289297] env[59518]: _type = "Task" [ 1242.289297] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1242.296469] env[59518]: DEBUG oslo_vmware.api [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Task: {'id': task-308013, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1242.798990] env[59518]: DEBUG oslo_vmware.exceptions [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Fault InvalidArgument not matched. {{(pid=59518) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1242.799283] env[59518]: DEBUG oslo_concurrency.lockutils [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1242.799759] env[59518]: ERROR nova.compute.manager [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1242.799759] env[59518]: Faults: ['InvalidArgument'] [ 1242.799759] env[59518]: ERROR nova.compute.manager [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Traceback (most recent call last): [ 1242.799759] env[59518]: ERROR nova.compute.manager [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1242.799759] env[59518]: ERROR nova.compute.manager [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] yield resources [ 1242.799759] env[59518]: ERROR nova.compute.manager [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1242.799759] env[59518]: ERROR nova.compute.manager [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] self.driver.spawn(context, instance, image_meta, [ 1242.799759] env[59518]: ERROR nova.compute.manager [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1242.799759] env[59518]: ERROR nova.compute.manager [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1242.799759] env[59518]: ERROR nova.compute.manager [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1242.799759] env[59518]: ERROR nova.compute.manager [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] self._fetch_image_if_missing(context, vi) [ 1242.799759] env[59518]: ERROR nova.compute.manager [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in 
_fetch_image_if_missing [ 1242.800340] env[59518]: ERROR nova.compute.manager [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] image_cache(vi, tmp_image_ds_loc) [ 1242.800340] env[59518]: ERROR nova.compute.manager [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1242.800340] env[59518]: ERROR nova.compute.manager [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] vm_util.copy_virtual_disk( [ 1242.800340] env[59518]: ERROR nova.compute.manager [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1242.800340] env[59518]: ERROR nova.compute.manager [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] session._wait_for_task(vmdk_copy_task) [ 1242.800340] env[59518]: ERROR nova.compute.manager [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1242.800340] env[59518]: ERROR nova.compute.manager [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] return self.wait_for_task(task_ref) [ 1242.800340] env[59518]: ERROR nova.compute.manager [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1242.800340] env[59518]: ERROR nova.compute.manager [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] return evt.wait() [ 1242.800340] env[59518]: ERROR nova.compute.manager [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1242.800340] env[59518]: ERROR nova.compute.manager [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] result = hub.switch() [ 1242.800340] env[59518]: ERROR nova.compute.manager [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1242.800340] env[59518]: ERROR nova.compute.manager [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] return self.greenlet.switch() [ 1242.801056] env[59518]: ERROR nova.compute.manager [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1242.801056] env[59518]: ERROR nova.compute.manager [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] self.f(*self.args, **self.kw) [ 1242.801056] env[59518]: ERROR nova.compute.manager [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1242.801056] env[59518]: ERROR nova.compute.manager [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] raise exceptions.translate_fault(task_info.error) [ 1242.801056] env[59518]: ERROR nova.compute.manager [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1242.801056] env[59518]: ERROR nova.compute.manager [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Faults: ['InvalidArgument'] [ 1242.801056] env[59518]: ERROR nova.compute.manager [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] [ 1242.801056] env[59518]: INFO nova.compute.manager [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Terminating instance [ 1242.801602] env[59518]: DEBUG oslo_concurrency.lockutils [None 
req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1242.801808] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1242.802038] env[59518]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-cfc6fc56-4769-473e-841d-39e4417d1dfc {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1242.804342] env[59518]: DEBUG nova.compute.manager [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Start destroying the instance on the hypervisor. {{(pid=59518) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1242.804523] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Destroying instance {{(pid=59518) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1242.805204] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b21f7c9c-3982-4a3b-b82d-d9e9100a9589 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1242.811533] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Unregistering the VM {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1242.811726] env[59518]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-52a385dd-c9eb-4344-a443-0af3cb611e8f {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1242.813737] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1242.813900] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=59518) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1242.814776] env[59518]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e4db516e-e30f-41d5-993e-cdb53444f5f9 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1242.819358] env[59518]: DEBUG oslo_vmware.api [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Waiting for the task: (returnval){ [ 1242.819358] env[59518]: value = "session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]52b5a35c-e00d-85a2-0a26-0b60e56c6303" [ 1242.819358] env[59518]: _type = "Task" [ 1242.819358] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1242.833288] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Preparing fetch location {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1242.833496] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Creating directory with path [datastore1] vmware_temp/9d0de1fd-94e5-4eb6-bf3a-8666bb9bff25/e70539a9-144d-4900-807e-914ae0cc8539 {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1242.833685] env[59518]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-53d2f4b6-4a03-4b59-a249-fa42250b6987 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1242.852722] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Created directory with path [datastore1] vmware_temp/9d0de1fd-94e5-4eb6-bf3a-8666bb9bff25/e70539a9-144d-4900-807e-914ae0cc8539 {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1242.852899] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Fetch image to [datastore1] vmware_temp/9d0de1fd-94e5-4eb6-bf3a-8666bb9bff25/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1242.853057] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Downloading image file data e70539a9-144d-4900-807e-914ae0cc8539 to [datastore1] vmware_temp/9d0de1fd-94e5-4eb6-bf3a-8666bb9bff25/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk on the data store datastore1 {{(pid=59518) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1242.853814] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-9133612b-2747-46d1-86cb-f259f6821998 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1242.860200] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1459931f-1298-4757-8870-73937593d828 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1242.868742] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ab7bdcb-9061-44b2-9fed-a38607172cbd {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1242.902679] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7a0f0dc8-48b7-4fa1-b8af-4e9ed5cc7a77 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1242.905096] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Unregistered the VM {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1242.905284] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Deleting contents of the VM from datastore datastore1 {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1242.905447] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Deleting the datastore file [datastore1] af4b8dd9-a05d-427e-a147-76c7cfec5862 {{(pid=59518) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1242.905725] env[59518]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-db2e0703-5990-4dfb-8af7-d09d3f756beb {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1242.912015] env[59518]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-112bfb86-565b-46b5-bbdf-4c8fec3812bb {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1242.913638] env[59518]: DEBUG oslo_vmware.api [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Waiting for the task: (returnval){ [ 1242.913638] env[59518]: value = "task-308015" [ 1242.913638] env[59518]: _type = "Task" [ 1242.913638] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1242.920591] env[59518]: DEBUG oslo_vmware.api [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Task: {'id': task-308015, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1242.930860] env[59518]: DEBUG nova.virt.vmwareapi.images [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Downloading image file data e70539a9-144d-4900-807e-914ae0cc8539 to the data store datastore1 {{(pid=59518) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1243.103759] env[59518]: DEBUG oslo_vmware.rw_handles [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/9d0de1fd-94e5-4eb6-bf3a-8666bb9bff25/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59518) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1243.157404] env[59518]: DEBUG oslo_vmware.rw_handles [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Completed reading data from the image iterator. {{(pid=59518) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1243.157576] env[59518]: DEBUG oslo_vmware.rw_handles [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/9d0de1fd-94e5-4eb6-bf3a-8666bb9bff25/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59518) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1243.423416] env[59518]: DEBUG oslo_vmware.api [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Task: {'id': task-308015, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.070133} completed successfully. 
{{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1243.423809] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Deleted the datastore file {{(pid=59518) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1243.423864] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Deleted contents of the VM from datastore datastore1 {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1243.424023] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Instance destroyed {{(pid=59518) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1243.424197] env[59518]: INFO nova.compute.manager [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Took 0.62 seconds to destroy the instance on the hypervisor. [ 1243.426286] env[59518]: DEBUG nova.compute.claims [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Aborting claim: {{(pid=59518) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1243.426494] env[59518]: DEBUG oslo_concurrency.lockutils [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1243.426733] env[59518]: DEBUG oslo_concurrency.lockutils [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1243.513986] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f29a9d53-a5f0-4d3b-aa6b-9199bc220476 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1243.521665] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6242bb5f-ed17-47c5-8407-f3cf1c712573 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1243.550971] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5fef9bd9-1683-44d9-a422-099a6ddddf64 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1243.557852] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-3371c52d-461e-4913-8bd8-26dae7fcc370 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1243.570596] env[59518]: DEBUG nova.compute.provider_tree [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Inventory has not changed in ProviderTree for provider: ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd {{(pid=59518) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1243.578733] env[59518]: DEBUG nova.scheduler.client.report [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Inventory has not changed for provider ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 174, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59518) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1243.591109] env[59518]: DEBUG oslo_concurrency.lockutils [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.164s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1243.591633] env[59518]: ERROR nova.compute.manager [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1243.591633] env[59518]: Faults: ['InvalidArgument'] [ 1243.591633] env[59518]: ERROR nova.compute.manager [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Traceback (most recent call last): [ 1243.591633] env[59518]: ERROR nova.compute.manager [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1243.591633] env[59518]: ERROR nova.compute.manager [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] self.driver.spawn(context, instance, image_meta, [ 1243.591633] env[59518]: ERROR nova.compute.manager [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1243.591633] env[59518]: ERROR nova.compute.manager [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1243.591633] env[59518]: ERROR nova.compute.manager [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1243.591633] env[59518]: ERROR nova.compute.manager [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] self._fetch_image_if_missing(context, vi) [ 1243.591633] env[59518]: ERROR nova.compute.manager [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1243.591633] env[59518]: ERROR nova.compute.manager 
[instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] image_cache(vi, tmp_image_ds_loc) [ 1243.591633] env[59518]: ERROR nova.compute.manager [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1243.592024] env[59518]: ERROR nova.compute.manager [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] vm_util.copy_virtual_disk( [ 1243.592024] env[59518]: ERROR nova.compute.manager [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1243.592024] env[59518]: ERROR nova.compute.manager [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] session._wait_for_task(vmdk_copy_task) [ 1243.592024] env[59518]: ERROR nova.compute.manager [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1243.592024] env[59518]: ERROR nova.compute.manager [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] return self.wait_for_task(task_ref) [ 1243.592024] env[59518]: ERROR nova.compute.manager [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1243.592024] env[59518]: ERROR nova.compute.manager [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] return evt.wait() [ 1243.592024] env[59518]: ERROR nova.compute.manager [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1243.592024] env[59518]: ERROR nova.compute.manager [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] result = hub.switch() [ 1243.592024] env[59518]: ERROR nova.compute.manager [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1243.592024] env[59518]: ERROR nova.compute.manager [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] return self.greenlet.switch() [ 1243.592024] env[59518]: ERROR nova.compute.manager [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1243.592024] env[59518]: ERROR nova.compute.manager [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] self.f(*self.args, **self.kw) [ 1243.592392] env[59518]: ERROR nova.compute.manager [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1243.592392] env[59518]: ERROR nova.compute.manager [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] raise exceptions.translate_fault(task_info.error) [ 1243.592392] env[59518]: ERROR nova.compute.manager [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1243.592392] env[59518]: ERROR nova.compute.manager [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Faults: ['InvalidArgument'] [ 1243.592392] env[59518]: ERROR nova.compute.manager [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] [ 1243.592392] env[59518]: DEBUG nova.compute.utils [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] VimFaultException {{(pid=59518) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1243.593920] env[59518]: DEBUG nova.compute.manager [None 
req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Build of instance af4b8dd9-a05d-427e-a147-76c7cfec5862 was re-scheduled: A specified parameter was not correct: fileType [ 1243.593920] env[59518]: Faults: ['InvalidArgument'] {{(pid=59518) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 1243.594286] env[59518]: DEBUG nova.compute.manager [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Unplugging VIFs for instance {{(pid=59518) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1243.594451] env[59518]: DEBUG nova.compute.manager [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59518) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1243.594611] env[59518]: DEBUG nova.compute.manager [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Deallocating network for instance {{(pid=59518) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1243.594767] env[59518]: DEBUG nova.network.neutron [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] deallocate_for_instance() {{(pid=59518) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1243.877790] env[59518]: DEBUG nova.network.neutron [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Updating instance_info_cache with network_info: [] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1243.890393] env[59518]: INFO nova.compute.manager [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Took 0.30 seconds to deallocate network for instance. 
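The traceback above bottoms out in oslo_vmware.api._poll_task raising exceptions.translate_fault(task_info.error) from inside an eventlet looping call; that is how the vCenter task fault ("A specified parameter was not correct: fileType", Faults: ['InvalidArgument']) surfaces in nova.compute.manager as a VimFaultException, after the "Fault InvalidArgument not matched" DEBUG line shows the fault falling through to the generic class. A minimal, self-contained sketch of that poll-and-translate pattern follows; every name in it is a simplified stand-in, not the real oslo.vmware implementation.

    # Sketch of the wait_for_task/_poll_task flow seen in the traceback.
    # All classes and functions here are illustrative stand-ins.
    import time

    class VimFaultException(Exception):
        def __init__(self, fault_list, message):
            super().__init__(message)
            self.fault_list = fault_list

    def translate_fault(error):
        # oslo.vmware maps known fault names to specific exception classes;
        # an unmatched fault (the "Fault InvalidArgument not matched" DEBUG
        # line) falls back to a generic VimFaultException.
        return VimFaultException(error["faults"], error["message"])

    def wait_for_task(poll_task_info, interval=0.5):
        # Poll the task until it finishes; raise the translated fault on error.
        while True:
            info = poll_task_info()
            if info["state"] == "success":
                return info
            if info["state"] == "error":
                raise translate_fault(info["error"])
            time.sleep(interval)  # the real code yields via a looping call

    # Example: a CopyVirtualDisk_Task failing the way task-308013 did.
    states = iter([
        {"state": "running"},
        {"state": "error", "error": {
            "faults": ["InvalidArgument"],
            "message": "A specified parameter was not correct: fileType"}},
    ])
    try:
        wait_for_task(lambda: next(states))
    except VimFaultException as exc:
        print(exc, exc.fault_list)

Because the fault is raised out of the copy step, _build_and_run_instance catches it, aborts the resource claim, and re-schedules the build, which is exactly the sequence the surrounding records show.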
[ 1243.975691] env[59518]: INFO nova.scheduler.client.report [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Deleted allocations for instance af4b8dd9-a05d-427e-a147-76c7cfec5862 [ 1243.993508] env[59518]: DEBUG oslo_concurrency.lockutils [None req-32530df9-5803-4d52-bb86-e7d79fa607f1 tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Lock "af4b8dd9-a05d-427e-a147-76c7cfec5862" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 623.038s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1243.994121] env[59518]: DEBUG oslo_concurrency.lockutils [None req-97aac32e-ac6f-40ba-848a-2ac12c57224d tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Lock "af4b8dd9-a05d-427e-a147-76c7cfec5862" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 425.707s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1243.994334] env[59518]: DEBUG oslo_concurrency.lockutils [None req-97aac32e-ac6f-40ba-848a-2ac12c57224d tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Acquiring lock "af4b8dd9-a05d-427e-a147-76c7cfec5862-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1243.994586] env[59518]: DEBUG oslo_concurrency.lockutils [None req-97aac32e-ac6f-40ba-848a-2ac12c57224d tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Lock "af4b8dd9-a05d-427e-a147-76c7cfec5862-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1243.994675] env[59518]: DEBUG oslo_concurrency.lockutils [None req-97aac32e-ac6f-40ba-848a-2ac12c57224d tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Lock "af4b8dd9-a05d-427e-a147-76c7cfec5862-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1243.996988] env[59518]: INFO nova.compute.manager [None req-97aac32e-ac6f-40ba-848a-2ac12c57224d tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Terminating instance [ 1243.999087] env[59518]: DEBUG nova.compute.manager [None req-97aac32e-ac6f-40ba-848a-2ac12c57224d tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Start destroying the instance on the hypervisor. 
{{(pid=59518) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1243.999304] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-97aac32e-ac6f-40ba-848a-2ac12c57224d tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Destroying instance {{(pid=59518) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1243.999563] env[59518]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-5f87f7e7-ed05-4dc8-9c7b-183dd6f355b3 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1244.008575] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a3d0ef81-7b2a-44ce-936a-8d29d77a24c5 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1244.036245] env[59518]: WARNING nova.virt.vmwareapi.vmops [None req-97aac32e-ac6f-40ba-848a-2ac12c57224d tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance af4b8dd9-a05d-427e-a147-76c7cfec5862 could not be found. [ 1244.037789] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-97aac32e-ac6f-40ba-848a-2ac12c57224d tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Instance destroyed {{(pid=59518) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1244.037789] env[59518]: INFO nova.compute.manager [None req-97aac32e-ac6f-40ba-848a-2ac12c57224d tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1244.037789] env[59518]: DEBUG oslo.service.loopingcall [None req-97aac32e-ac6f-40ba-848a-2ac12c57224d tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59518) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1244.037789] env[59518]: DEBUG nova.compute.manager [-] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Deallocating network for instance {{(pid=59518) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1244.037789] env[59518]: DEBUG nova.network.neutron [-] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] deallocate_for_instance() {{(pid=59518) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1244.066864] env[59518]: DEBUG nova.network.neutron [-] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Updating instance_info_cache with network_info: [] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1244.074320] env[59518]: INFO nova.compute.manager [-] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] Took 0.04 seconds to deallocate network for instance. 
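The WARNING just above ("Instance does not exist on backend: nova.exception.InstanceNotFound") shows the terminate path tolerating a VM that vCenter no longer has: the driver treats the missing VM as already destroyed, and the manager still runs network deallocation inside a retrying looping call. A short sketch of that tolerate-missing-then-clean-up shape, with hypothetical driver and network objects standing in for the real Nova interfaces:

    # Sketch of the destroy path implied by the log records above.
    # InstanceNotFound, FakeDriver and FakeNetworkAPI are stand-ins.
    class InstanceNotFound(Exception):
        pass

    def destroy_instance(driver, network_api, instance):
        try:
            driver.destroy(instance)        # UnregisterVM + datastore delete
        except InstanceNotFound:
            # Matches the WARNING in the log: nothing left on the hypervisor,
            # so destruction is considered complete rather than failed.
            print(f"Instance {instance} does not exist on backend; continuing")
        # Network cleanup runs regardless; Nova wraps it in a looping call
        # ("Waiting for function ... _deallocate_network_with_retries").
        network_api.deallocate_for_instance(instance)

    class FakeDriver:
        def destroy(self, instance):
            raise InstanceNotFound

    class FakeNetworkAPI:
        def deallocate_for_instance(self, instance):
            print(f"deallocated network for {instance}")

    destroy_instance(FakeDriver(), FakeNetworkAPI(),
                     "af4b8dd9-a05d-427e-a147-76c7cfec5862")

This is why the delete still completes in 0.159s with the lock released cleanly even though the first destroy (during the failed spawn) had already removed the VM and its datastore files.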
[ 1244.153027] env[59518]: DEBUG oslo_concurrency.lockutils [None req-97aac32e-ac6f-40ba-848a-2ac12c57224d tempest-ServerDiskConfigTestJSON-1589077293 tempest-ServerDiskConfigTestJSON-1589077293-project-member] Lock "af4b8dd9-a05d-427e-a147-76c7cfec5862" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.159s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1244.153823] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "af4b8dd9-a05d-427e-a147-76c7cfec5862" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 33.551s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1244.153999] env[59518]: INFO nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: af4b8dd9-a05d-427e-a147-76c7cfec5862] During sync_power_state the instance has a pending task (deleting). Skip. [ 1244.154196] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "af4b8dd9-a05d-427e-a147-76c7cfec5862" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1250.358267] env[59518]: DEBUG nova.compute.manager [req-9de8dd7e-b367-4b56-9e06-7f2fbbb72afa req-7b8af607-fbfa-4dfc-80a2-eda165e617b0 service nova] [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] Received event network-vif-deleted-1eeb9a90-e052-4a24-a64f-a310a82b6cde {{(pid=59518) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 1277.448114] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1281.448796] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1285.450312] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1285.450312] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Starting heal instance info cache {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9808}} [ 1285.450312] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Rebuilding the list of instances to heal {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}} [ 1285.461396] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Skipping network cache update for instance because it is Building. 
{{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 1285.461545] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: 8b692644-9080-4ffd-89b3-c8cd64de0e4f] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 1285.461667] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Didn't find any instances for network info cache update. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9894}} [ 1287.448602] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1288.443923] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1288.455872] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1288.456215] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1289.065055] env[59518]: WARNING oslo_vmware.rw_handles [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1289.065055] env[59518]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1289.065055] env[59518]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1289.065055] env[59518]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1289.065055] env[59518]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1289.065055] env[59518]: ERROR oslo_vmware.rw_handles response.begin() [ 1289.065055] env[59518]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1289.065055] env[59518]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1289.065055] env[59518]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1289.065055] env[59518]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1289.065055] env[59518]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1289.065055] env[59518]: ERROR oslo_vmware.rw_handles [ 1289.065490] env[59518]: DEBUG nova.virt.vmwareapi.images [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] 
[instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Downloaded image file data e70539a9-144d-4900-807e-914ae0cc8539 to vmware_temp/9d0de1fd-94e5-4eb6-bf3a-8666bb9bff25/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk on the data store datastore1 {{(pid=59518) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1289.067224] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Caching image {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1289.067459] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Copying Virtual Disk [datastore1] vmware_temp/9d0de1fd-94e5-4eb6-bf3a-8666bb9bff25/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk to [datastore1] vmware_temp/9d0de1fd-94e5-4eb6-bf3a-8666bb9bff25/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk {{(pid=59518) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1289.067720] env[59518]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-3e1f5e5e-037d-480c-b482-627a5a7460db {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1289.077325] env[59518]: DEBUG oslo_vmware.api [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Waiting for the task: (returnval){ [ 1289.077325] env[59518]: value = "task-308016" [ 1289.077325] env[59518]: _type = "Task" [ 1289.077325] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1289.084854] env[59518]: DEBUG oslo_vmware.api [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Task: {'id': task-308016, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1289.447813] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1289.448131] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1289.448334] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=59518) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10427}} [ 1289.587438] env[59518]: DEBUG oslo_vmware.exceptions [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Fault InvalidArgument not matched. 
{{(pid=59518) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1289.587854] env[59518]: DEBUG oslo_concurrency.lockutils [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1289.588332] env[59518]: ERROR nova.compute.manager [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1289.588332] env[59518]: Faults: ['InvalidArgument'] [ 1289.588332] env[59518]: ERROR nova.compute.manager [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Traceback (most recent call last): [ 1289.588332] env[59518]: ERROR nova.compute.manager [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1289.588332] env[59518]: ERROR nova.compute.manager [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] yield resources [ 1289.588332] env[59518]: ERROR nova.compute.manager [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1289.588332] env[59518]: ERROR nova.compute.manager [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] self.driver.spawn(context, instance, image_meta, [ 1289.588332] env[59518]: ERROR nova.compute.manager [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1289.588332] env[59518]: ERROR nova.compute.manager [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1289.588332] env[59518]: ERROR nova.compute.manager [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1289.588332] env[59518]: ERROR nova.compute.manager [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] self._fetch_image_if_missing(context, vi) [ 1289.588332] env[59518]: ERROR nova.compute.manager [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1289.588680] env[59518]: ERROR nova.compute.manager [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] image_cache(vi, tmp_image_ds_loc) [ 1289.588680] env[59518]: ERROR nova.compute.manager [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1289.588680] env[59518]: ERROR nova.compute.manager [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] vm_util.copy_virtual_disk( [ 1289.588680] env[59518]: ERROR nova.compute.manager [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1289.588680] env[59518]: ERROR nova.compute.manager [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] session._wait_for_task(vmdk_copy_task) [ 1289.588680] env[59518]: ERROR nova.compute.manager [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1289.588680] env[59518]: ERROR nova.compute.manager [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] return self.wait_for_task(task_ref) [ 1289.588680] env[59518]: ERROR nova.compute.manager [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1289.588680] env[59518]: ERROR nova.compute.manager [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] return evt.wait() [ 1289.588680] env[59518]: ERROR nova.compute.manager [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1289.588680] env[59518]: ERROR nova.compute.manager [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] result = hub.switch() [ 1289.588680] env[59518]: ERROR nova.compute.manager [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1289.588680] env[59518]: ERROR nova.compute.manager [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] return self.greenlet.switch() [ 1289.589014] env[59518]: ERROR nova.compute.manager [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1289.589014] env[59518]: ERROR nova.compute.manager [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] self.f(*self.args, **self.kw) [ 1289.589014] env[59518]: ERROR nova.compute.manager [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1289.589014] env[59518]: ERROR nova.compute.manager [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] raise exceptions.translate_fault(task_info.error) [ 1289.589014] env[59518]: ERROR nova.compute.manager [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1289.589014] env[59518]: ERROR nova.compute.manager [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Faults: ['InvalidArgument'] [ 1289.589014] env[59518]: ERROR nova.compute.manager [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] [ 1289.589014] env[59518]: INFO nova.compute.manager [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Terminating instance [ 1289.590318] env[59518]: DEBUG oslo_concurrency.lockutils [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1289.590554] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1289.590922] env[59518]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-24716226-59c0-4938-8915-b322883d3938 {{(pid=59518) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1289.593080] env[59518]: DEBUG nova.compute.manager [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Start destroying the instance on the hypervisor. {{(pid=59518) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1289.593293] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Destroying instance {{(pid=59518) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1289.594086] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b1344833-39b9-4b59-bbad-40bbc1f1cd13 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1289.600628] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Unregistering the VM {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1289.600859] env[59518]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-d855995f-3a3d-4e1a-beff-c09306cb46e5 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1289.602906] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1289.603077] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=59518) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1289.603969] env[59518]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-749f7ddf-c8d1-4b1e-ac97-f321447a82d7 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1289.608242] env[59518]: DEBUG oslo_vmware.api [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] Waiting for the task: (returnval){ [ 1289.608242] env[59518]: value = "session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]52c11b61-85b5-7b4f-0afe-9a22c6063ff2" [ 1289.608242] env[59518]: _type = "Task" [ 1289.608242] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1289.615041] env[59518]: DEBUG oslo_vmware.api [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] Task: {'id': session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]52c11b61-85b5-7b4f-0afe-9a22c6063ff2, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1289.663545] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Unregistered the VM {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1289.663729] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Deleting contents of the VM from datastore datastore1 {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1289.663903] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Deleting the datastore file [datastore1] ae88d565-bbf5-4c29-aee9-364c23086de5 {{(pid=59518) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1289.664187] env[59518]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-1d57c529-4907-4d49-bc2c-1a6d50006b3e {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1289.670336] env[59518]: DEBUG oslo_vmware.api [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Waiting for the task: (returnval){ [ 1289.670336] env[59518]: value = "task-308018" [ 1289.670336] env[59518]: _type = "Task" [ 1289.670336] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1289.677471] env[59518]: DEBUG oslo_vmware.api [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Task: {'id': task-308018, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1290.118429] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] [instance: ac3485ac-4817-4492-a196-331002b2cc46] Preparing fetch location {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1290.118703] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] Creating directory with path [datastore1] vmware_temp/c1b71a21-25d3-42f2-bc9b-b780f65b3ba1/e70539a9-144d-4900-807e-914ae0cc8539 {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1290.118900] env[59518]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2a8d2aac-6b2e-4fa7-8758-7bd9148283dc {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1290.129525] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] Created directory with path [datastore1] vmware_temp/c1b71a21-25d3-42f2-bc9b-b780f65b3ba1/e70539a9-144d-4900-807e-914ae0cc8539 {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1290.129734] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] [instance: ac3485ac-4817-4492-a196-331002b2cc46] Fetch image to [datastore1] vmware_temp/c1b71a21-25d3-42f2-bc9b-b780f65b3ba1/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1290.129869] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] [instance: ac3485ac-4817-4492-a196-331002b2cc46] Downloading image file data e70539a9-144d-4900-807e-914ae0cc8539 to [datastore1] vmware_temp/c1b71a21-25d3-42f2-bc9b-b780f65b3ba1/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk on the data store datastore1 {{(pid=59518) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1290.130578] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-270f14a3-88bc-44bb-b297-049094237bbb {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1290.136843] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-55e32ac8-2ef4-433c-b491-8c0a07dca0a5 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1290.145361] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c2a0c215-7a50-46f7-9d7a-ca30a256e9bf {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1290.178652] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e89db80e-18a9-430e-add5-65d327ad809c {{(pid=59518) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1290.185352] env[59518]: DEBUG oslo_vmware.api [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Task: {'id': task-308018, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074128} completed successfully. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1290.186774] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Deleted the datastore file {{(pid=59518) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1290.186958] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Deleted contents of the VM from datastore datastore1 {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1290.187113] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Instance destroyed {{(pid=59518) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1290.187276] env[59518]: INFO nova.compute.manager [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Took 0.59 seconds to destroy the instance on the hypervisor. 
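
The delete-and-poll sequence above (FileManager.DeleteDatastoreFile_Task followed by the repeated _poll_task lines) is the standard oslo.vmware pattern: invoke a *_Task SOAP method to get a task reference, then block on wait_for_task, which polls progress and raises a translated fault if the task errors out. Below is a minimal sketch of that pattern; the host, credentials, and the datacenter moref id are placeholders, not values from this deployment:

from oslo_vmware import api as vmware_api
from oslo_vmware import exceptions as vexc
from oslo_vmware import vim_util

# Illustrative session setup; host and credentials are placeholders.
session = vmware_api.VMwareAPISession(
    host='vc.example.test',
    server_username='user',
    server_password='secret',
    api_retry_count=3,
    task_poll_interval=0.5)

file_manager = session.vim.service_content.fileManager
# Placeholder Datacenter managed-object ref; a real caller resolves this
# from the inventory first.
dc_ref = vim_util.get_moref('datacenter-3', 'Datacenter')

# Kick off the asynchronous delete; vCenter returns a Task reference.
task = session.invoke_api(
    session.vim, 'DeleteDatastoreFile_Task', file_manager,
    name='[datastore1] ae88d565-bbf5-4c29-aee9-364c23086de5',
    datacenter=dc_ref)

try:
    # wait_for_task drives the _poll_task loop seen in the log and raises
    # VimFaultException if the task ends in error (e.g. InvalidArgument).
    session.wait_for_task(task)
except vexc.VimFaultException as e:
    print('Delete failed: %s (faults: %s)' % (e, e.fault_list))

The successful task-308018 above completes in ~0.07s; the fileType failure that follows is the same wait_for_task call surfacing a VimFaultException from a CopyVirtualDisk_Task instead.
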
[ 1290.188996] env[59518]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-8e1ff7b1-de1a-4f7d-865c-08bb5e1bada3 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1290.190795] env[59518]: DEBUG nova.compute.claims [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Aborting claim: {{(pid=59518) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1290.190960] env[59518]: DEBUG oslo_concurrency.lockutils [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1290.191209] env[59518]: DEBUG oslo_concurrency.lockutils [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1290.212245] env[59518]: DEBUG nova.virt.vmwareapi.images [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] [instance: ac3485ac-4817-4492-a196-331002b2cc46] Downloading image file data e70539a9-144d-4900-807e-914ae0cc8539 to the data store datastore1 {{(pid=59518) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1290.262751] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d93517e-4bd5-4b8c-ae3d-a1a336fa1f0b {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1290.270005] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8edfb4eb-5a09-459a-ba63-6317db11b951 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1290.301042] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-203dc895-0ba3-4299-9ec5-838f087050b8 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1290.307782] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0bf34521-2c51-4e43-8b24-a96f5aa39793 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1290.320296] env[59518]: DEBUG nova.compute.provider_tree [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Inventory has not changed in ProviderTree for provider: ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd {{(pid=59518) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1290.328118] env[59518]: DEBUG nova.scheduler.client.report [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 
tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Inventory has not changed for provider ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 174, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59518) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1290.340980] env[59518]: DEBUG oslo_concurrency.lockutils [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.150s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1290.341498] env[59518]: ERROR nova.compute.manager [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1290.341498] env[59518]: Faults: ['InvalidArgument'] [ 1290.341498] env[59518]: ERROR nova.compute.manager [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Traceback (most recent call last): [ 1290.341498] env[59518]: ERROR nova.compute.manager [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1290.341498] env[59518]: ERROR nova.compute.manager [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] self.driver.spawn(context, instance, image_meta, [ 1290.341498] env[59518]: ERROR nova.compute.manager [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1290.341498] env[59518]: ERROR nova.compute.manager [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1290.341498] env[59518]: ERROR nova.compute.manager [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1290.341498] env[59518]: ERROR nova.compute.manager [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] self._fetch_image_if_missing(context, vi) [ 1290.341498] env[59518]: ERROR nova.compute.manager [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1290.341498] env[59518]: ERROR nova.compute.manager [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] image_cache(vi, tmp_image_ds_loc) [ 1290.341498] env[59518]: ERROR nova.compute.manager [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1290.341860] env[59518]: ERROR nova.compute.manager [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] vm_util.copy_virtual_disk( [ 1290.341860] env[59518]: ERROR nova.compute.manager [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1290.341860] env[59518]: ERROR nova.compute.manager [instance: 
ae88d565-bbf5-4c29-aee9-364c23086de5] session._wait_for_task(vmdk_copy_task) [ 1290.341860] env[59518]: ERROR nova.compute.manager [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1290.341860] env[59518]: ERROR nova.compute.manager [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] return self.wait_for_task(task_ref) [ 1290.341860] env[59518]: ERROR nova.compute.manager [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1290.341860] env[59518]: ERROR nova.compute.manager [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] return evt.wait() [ 1290.341860] env[59518]: ERROR nova.compute.manager [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1290.341860] env[59518]: ERROR nova.compute.manager [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] result = hub.switch() [ 1290.341860] env[59518]: ERROR nova.compute.manager [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1290.341860] env[59518]: ERROR nova.compute.manager [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] return self.greenlet.switch() [ 1290.341860] env[59518]: ERROR nova.compute.manager [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1290.341860] env[59518]: ERROR nova.compute.manager [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] self.f(*self.args, **self.kw) [ 1290.342220] env[59518]: ERROR nova.compute.manager [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1290.342220] env[59518]: ERROR nova.compute.manager [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] raise exceptions.translate_fault(task_info.error) [ 1290.342220] env[59518]: ERROR nova.compute.manager [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1290.342220] env[59518]: ERROR nova.compute.manager [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Faults: ['InvalidArgument'] [ 1290.342220] env[59518]: ERROR nova.compute.manager [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] [ 1290.342220] env[59518]: DEBUG nova.compute.utils [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] VimFaultException {{(pid=59518) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1290.343545] env[59518]: DEBUG nova.compute.manager [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Build of instance ae88d565-bbf5-4c29-aee9-364c23086de5 was re-scheduled: A specified parameter was not correct: fileType [ 1290.343545] env[59518]: Faults: ['InvalidArgument'] {{(pid=59518) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 1290.343907] env[59518]: DEBUG nova.compute.manager [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 
tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Unplugging VIFs for instance {{(pid=59518) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1290.344084] env[59518]: DEBUG nova.compute.manager [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. {{(pid=59518) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1290.344249] env[59518]: DEBUG nova.compute.manager [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Deallocating network for instance {{(pid=59518) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1290.344408] env[59518]: DEBUG nova.network.neutron [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] deallocate_for_instance() {{(pid=59518) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1290.413098] env[59518]: DEBUG oslo_concurrency.lockutils [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1290.414720] env[59518]: ERROR nova.compute.manager [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] [instance: ac3485ac-4817-4492-a196-331002b2cc46] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image e70539a9-144d-4900-807e-914ae0cc8539. 
[ 1290.414720] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] Traceback (most recent call last): [ 1290.414720] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1290.414720] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1290.414720] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1290.414720] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] result = getattr(controller, method)(*args, **kwargs) [ 1290.414720] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1290.414720] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] return self._get(image_id) [ 1290.414720] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1290.414720] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1290.414720] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1290.415031] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] resp, body = self.http_client.get(url, headers=header) [ 1290.415031] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1290.415031] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] return self.request(url, 'GET', **kwargs) [ 1290.415031] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1290.415031] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] return self._handle_response(resp) [ 1290.415031] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1290.415031] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] raise exc.from_response(resp, resp.content) [ 1290.415031] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1290.415031] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] [ 1290.415031] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] During handling of the above exception, another exception occurred: [ 1290.415031] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] [ 1290.415031] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] Traceback (most recent call last): [ 1290.415410] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1290.415410] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] yield resources [ 1290.415410] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1290.415410] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] self.driver.spawn(context, instance, image_meta, [ 1290.415410] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1290.415410] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1290.415410] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1290.415410] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] self._fetch_image_if_missing(context, vi) [ 1290.415410] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1290.415410] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] image_fetch(context, vi, tmp_image_ds_loc) [ 1290.415410] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1290.415410] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] images.fetch_image( [ 1290.415410] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1290.415767] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] metadata = IMAGE_API.get(context, image_ref) [ 1290.415767] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1290.415767] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] return session.show(context, image_id, [ 1290.415767] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1290.415767] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] _reraise_translated_image_exception(image_id) [ 1290.415767] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File 
"/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1290.415767] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] raise new_exc.with_traceback(exc_trace) [ 1290.415767] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1290.415767] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1290.415767] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1290.415767] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] result = getattr(controller, method)(*args, **kwargs) [ 1290.415767] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1290.415767] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] return self._get(image_id) [ 1290.416136] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1290.416136] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1290.416136] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1290.416136] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] resp, body = self.http_client.get(url, headers=header) [ 1290.416136] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1290.416136] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] return self.request(url, 'GET', **kwargs) [ 1290.416136] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1290.416136] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] return self._handle_response(resp) [ 1290.416136] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1290.416136] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] raise exc.from_response(resp, resp.content) [ 1290.416136] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] nova.exception.ImageNotAuthorized: Not authorized for image e70539a9-144d-4900-807e-914ae0cc8539. 
[ 1290.416136] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] [ 1290.416460] env[59518]: INFO nova.compute.manager [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] [instance: ac3485ac-4817-4492-a196-331002b2cc46] Terminating instance [ 1290.416460] env[59518]: DEBUG oslo_concurrency.lockutils [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1290.416676] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1290.417302] env[59518]: DEBUG nova.compute.manager [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] [instance: ac3485ac-4817-4492-a196-331002b2cc46] Start destroying the instance on the hypervisor. {{(pid=59518) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1290.417477] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] [instance: ac3485ac-4817-4492-a196-331002b2cc46] Destroying instance {{(pid=59518) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1290.417696] env[59518]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3bb4e9ba-68c4-4de1-965b-3a8fed581d26 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1290.420211] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05ec6cfa-043b-4dcd-98b6-33bc9c6c612e {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1290.428544] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] [instance: ac3485ac-4817-4492-a196-331002b2cc46] Unregistering the VM {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1290.428751] env[59518]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-1a7f0c2a-bc3f-402f-9449-01b8adfafee8 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1290.430863] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1290.431026] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 
tempest-DeleteServersAdminTestJSON-1007061305-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=59518) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1290.431999] env[59518]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-8214fc07-0704-423b-a9b0-9cffe52c3c6d {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1290.436484] env[59518]: DEBUG oslo_vmware.api [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Waiting for the task: (returnval){ [ 1290.436484] env[59518]: value = "session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]52adba65-2882-9b41-081b-81d2f49c9a55" [ 1290.436484] env[59518]: _type = "Task" [ 1290.436484] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1290.449841] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1290.450844] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] Preparing fetch location {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1290.451057] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Creating directory with path [datastore1] vmware_temp/31bb898c-31c8-44a6-b16e-ba0890792f7d/e70539a9-144d-4900-807e-914ae0cc8539 {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1290.451504] env[59518]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6e3618d9-8066-4e9b-8c12-80a03c0cc755 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1290.459167] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1290.459363] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1290.459528] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1290.459688] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Auditing locally available compute resources for cpu-1 (node: 
domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59518) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1290.460624] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-26c63bf9-0b88-4291-9388-9ff3c91860ce {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1290.471778] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Created directory with path [datastore1] vmware_temp/31bb898c-31c8-44a6-b16e-ba0890792f7d/e70539a9-144d-4900-807e-914ae0cc8539 {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1290.472021] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] Fetch image to [datastore1] vmware_temp/31bb898c-31c8-44a6-b16e-ba0890792f7d/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1290.472197] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] Downloading image file data e70539a9-144d-4900-807e-914ae0cc8539 to [datastore1] vmware_temp/31bb898c-31c8-44a6-b16e-ba0890792f7d/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk on the data store datastore1 {{(pid=59518) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1290.473250] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ca961dd-439b-4c2b-abb0-cf1b020d9908 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1290.480922] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-74f11980-bb56-4414-97d7-4059bcc30d90 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1290.487563] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-75c41864-ae76-4cb2-a3bf-87eb680b9322 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1290.498934] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-91120f22-ccea-403e-9641-04131f997f27 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1290.502189] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] [instance: ac3485ac-4817-4492-a196-331002b2cc46] Unregistered the VM {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1290.502383] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] [instance: ac3485ac-4817-4492-a196-331002b2cc46] Deleting contents of the VM 
from datastore datastore1 {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1290.502548] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] Deleting the datastore file [datastore1] ac3485ac-4817-4492-a196-331002b2cc46 {{(pid=59518) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1290.506688] env[59518]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-cc3e682d-0a97-442d-8fee-05496e0033c9 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1290.510484] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a171b0ef-d3da-4a24-b3c1-2bd0961c0c62 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1290.518242] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-612ff3a2-7bec-404e-9de8-5b22d76c1648 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1290.544843] env[59518]: DEBUG oslo_vmware.api [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] Waiting for the task: (returnval){ [ 1290.544843] env[59518]: value = "task-308020" [ 1290.544843] env[59518]: _type = "Task" [ 1290.544843] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1290.546311] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-363fe8f3-7238-4d20-83dc-c42f7e4df4ea {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1290.574548] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181721MB free_disk=174GB free_vcpus=48 pci_devices=None {{(pid=59518) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 1290.574702] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1290.574867] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1290.581786] env[59518]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-30c3bdbe-cca8-4eb1-a232-b7a44ae7fd4d {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1290.583489] env[59518]: DEBUG oslo_vmware.api [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] Task: {'id': task-308020, 'name': DeleteDatastoreFile_Task} progress is 
0%. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1290.602904] env[59518]: DEBUG nova.virt.vmwareapi.images [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] Downloading image file data e70539a9-144d-4900-807e-914ae0cc8539 to the data store datastore1 {{(pid=59518) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1290.626591] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance ae88d565-bbf5-4c29-aee9-364c23086de5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 1290.626750] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 8b692644-9080-4ffd-89b3-c8cd64de0e4f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1290.626928] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Total usable vcpus: 48, total allocated vcpus: 1 {{(pid=59518) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 1290.627064] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=640MB phys_disk=200GB used_disk=1GB total_vcpus=48 used_vcpus=1 pci_stats=[] {{(pid=59518) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 1290.676192] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c2dce995-e399-42b4-b87d-4962fda24952 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1290.680411] env[59518]: DEBUG nova.network.neutron [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Updating instance_info_cache with network_info: [] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1290.685599] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09912ced-8561-4443-ae62-3373f58c2345 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1290.719949] env[59518]: INFO nova.compute.manager [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Took 0.38 seconds to deallocate network for instance. 
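
The chained tracebacks logged for instances ac3485ac… (above) and 7d4fa130… (below) show the translation layer in nova/image/glance.py: glanceclient raises HTTPUnauthorized on a 401 from the image service, and _reraise_translated_image_exception re-raises it as nova.exception.ImageNotAuthorized while preserving the original traceback (the raise new_exc.with_traceback(exc_trace) frame in the dumps). A condensed sketch of that re-raise idiom follows; it uses stand-in exception and client classes rather than Nova's real ones, and the translator function name is illustrative:

import sys

class HTTPUnauthorized(Exception):
    """Stand-in for glanceclient.exc.HTTPUnauthorized (HTTP 401)."""

class ImageNotAuthorized(Exception):
    """Stand-in for nova.exception.ImageNotAuthorized."""
    def __init__(self, image_id):
        super().__init__('Not authorized for image %s.' % image_id)

def translate_image_exception(image_id, exc_value):
    # Hypothetical translator mirroring the mapping visible in the log:
    # a 401 from the image service becomes ImageNotAuthorized.
    if isinstance(exc_value, HTTPUnauthorized):
        return ImageNotAuthorized(image_id)
    return exc_value

def show(client, image_id):
    try:
        return client.get(image_id)
    except Exception:
        # Re-raise the translated exception with the original traceback;
        # this is why the log shows both tracebacks joined by "During
        # handling of the above exception, another exception occurred".
        exc_type, exc_value, exc_trace = sys.exc_info()
        new_exc = translate_image_exception(image_id, exc_value)
        raise new_exc.with_traceback(exc_trace)

With a client whose get() raises HTTPUnauthorized, show() surfaces ImageNotAuthorized to the compute manager, but the traceback still points into the glance client call, matching the frames dumped above.
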
[ 1290.724046] env[59518]: DEBUG oslo_concurrency.lockutils [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1290.724923] env[59518]: ERROR nova.compute.manager [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image e70539a9-144d-4900-807e-914ae0cc8539. [ 1290.724923] env[59518]: ERROR nova.compute.manager [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] Traceback (most recent call last): [ 1290.724923] env[59518]: ERROR nova.compute.manager [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1290.724923] env[59518]: ERROR nova.compute.manager [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1290.724923] env[59518]: ERROR nova.compute.manager [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1290.724923] env[59518]: ERROR nova.compute.manager [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] result = getattr(controller, method)(*args, **kwargs) [ 1290.724923] env[59518]: ERROR nova.compute.manager [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1290.724923] env[59518]: ERROR nova.compute.manager [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] return self._get(image_id) [ 1290.724923] env[59518]: ERROR nova.compute.manager [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1290.724923] env[59518]: ERROR nova.compute.manager [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1290.724923] env[59518]: ERROR nova.compute.manager [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1290.725264] env[59518]: ERROR nova.compute.manager [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] resp, body = self.http_client.get(url, headers=header) [ 1290.725264] env[59518]: ERROR nova.compute.manager [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1290.725264] env[59518]: ERROR nova.compute.manager [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] return self.request(url, 'GET', **kwargs) [ 1290.725264] env[59518]: ERROR nova.compute.manager [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1290.725264] env[59518]: ERROR nova.compute.manager [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] return self._handle_response(resp) [ 1290.725264] env[59518]: ERROR nova.compute.manager [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1290.725264] 
env[59518]: ERROR nova.compute.manager [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] raise exc.from_response(resp, resp.content) [ 1290.725264] env[59518]: ERROR nova.compute.manager [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1290.725264] env[59518]: ERROR nova.compute.manager [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] [ 1290.725264] env[59518]: ERROR nova.compute.manager [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] During handling of the above exception, another exception occurred: [ 1290.725264] env[59518]: ERROR nova.compute.manager [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] [ 1290.725264] env[59518]: ERROR nova.compute.manager [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] Traceback (most recent call last): [ 1290.725683] env[59518]: ERROR nova.compute.manager [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1290.725683] env[59518]: ERROR nova.compute.manager [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] yield resources [ 1290.725683] env[59518]: ERROR nova.compute.manager [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1290.725683] env[59518]: ERROR nova.compute.manager [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] self.driver.spawn(context, instance, image_meta, [ 1290.725683] env[59518]: ERROR nova.compute.manager [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1290.725683] env[59518]: ERROR nova.compute.manager [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1290.725683] env[59518]: ERROR nova.compute.manager [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1290.725683] env[59518]: ERROR nova.compute.manager [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] self._fetch_image_if_missing(context, vi) [ 1290.725683] env[59518]: ERROR nova.compute.manager [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1290.725683] env[59518]: ERROR nova.compute.manager [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] image_fetch(context, vi, tmp_image_ds_loc) [ 1290.725683] env[59518]: ERROR nova.compute.manager [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1290.725683] env[59518]: ERROR nova.compute.manager [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] images.fetch_image( [ 1290.725683] env[59518]: ERROR nova.compute.manager [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1290.726189] env[59518]: ERROR nova.compute.manager [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] metadata = IMAGE_API.get(context, image_ref) [ 1290.726189] env[59518]: ERROR nova.compute.manager [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1290.726189] env[59518]: ERROR nova.compute.manager 
[instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] return session.show(context, image_id, [ 1290.726189] env[59518]: ERROR nova.compute.manager [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1290.726189] env[59518]: ERROR nova.compute.manager [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] _reraise_translated_image_exception(image_id) [ 1290.726189] env[59518]: ERROR nova.compute.manager [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1290.726189] env[59518]: ERROR nova.compute.manager [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] raise new_exc.with_traceback(exc_trace) [ 1290.726189] env[59518]: ERROR nova.compute.manager [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1290.726189] env[59518]: ERROR nova.compute.manager [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1290.726189] env[59518]: ERROR nova.compute.manager [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1290.726189] env[59518]: ERROR nova.compute.manager [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] result = getattr(controller, method)(*args, **kwargs) [ 1290.726189] env[59518]: ERROR nova.compute.manager [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1290.726189] env[59518]: ERROR nova.compute.manager [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] return self._get(image_id) [ 1290.726530] env[59518]: ERROR nova.compute.manager [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1290.726530] env[59518]: ERROR nova.compute.manager [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1290.726530] env[59518]: ERROR nova.compute.manager [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1290.726530] env[59518]: ERROR nova.compute.manager [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] resp, body = self.http_client.get(url, headers=header) [ 1290.726530] env[59518]: ERROR nova.compute.manager [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1290.726530] env[59518]: ERROR nova.compute.manager [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] return self.request(url, 'GET', **kwargs) [ 1290.726530] env[59518]: ERROR nova.compute.manager [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1290.726530] env[59518]: ERROR nova.compute.manager [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] return self._handle_response(resp) [ 1290.726530] env[59518]: ERROR nova.compute.manager [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1290.726530] env[59518]: ERROR nova.compute.manager [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] raise exc.from_response(resp, resp.content) [ 1290.726530] env[59518]: ERROR nova.compute.manager [instance: 
7d4fa130-c399-4e8c-a711-33a08ed5dde9] nova.exception.ImageNotAuthorized: Not authorized for image e70539a9-144d-4900-807e-914ae0cc8539. [ 1290.726530] env[59518]: ERROR nova.compute.manager [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] [ 1290.726889] env[59518]: INFO nova.compute.manager [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] Terminating instance [ 1290.728293] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0790f80-7829-48cf-9e3a-88e170001cfb {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1290.731162] env[59518]: DEBUG oslo_concurrency.lockutils [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1290.731433] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1290.732039] env[59518]: DEBUG oslo_concurrency.lockutils [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Acquiring lock "refresh_cache-7d4fa130-c399-4e8c-a711-33a08ed5dde9" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1290.732256] env[59518]: DEBUG oslo_concurrency.lockutils [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Acquired lock "refresh_cache-7d4fa130-c399-4e8c-a711-33a08ed5dde9" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1290.732474] env[59518]: DEBUG nova.network.neutron [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] Building network info cache for instance {{(pid=59518) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1290.738881] env[59518]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-957b2508-8826-417a-8219-b508643127f4 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1290.740529] env[59518]: DEBUG nova.compute.utils [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] Can not refresh info_cache because instance was not found {{(pid=59518) refresh_info_cache_for_instance /opt/stack/nova/nova/compute/utils.py:1010}} [ 1290.748546] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-efb229c8-3e6a-4f44-9523-412f00c9184a {{(pid=59518) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1290.753313] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1290.753556] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=59518) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1290.754808] env[59518]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4f2da0a8-e286-4053-8ae2-821a67adc275 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1290.763766] env[59518]: DEBUG oslo_vmware.api [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Waiting for the task: (returnval){ [ 1290.763766] env[59518]: value = "session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]52e86f7b-69bd-2e4f-6f8d-56118572f4b1" [ 1290.763766] env[59518]: _type = "Task" [ 1290.763766] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1290.772490] env[59518]: DEBUG nova.compute.provider_tree [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Inventory has not changed in ProviderTree for provider: ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd {{(pid=59518) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1290.783521] env[59518]: DEBUG nova.scheduler.client.report [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Inventory has not changed for provider ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 174, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59518) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1290.786407] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] Preparing fetch location {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1290.786635] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Creating directory with path [datastore1] vmware_temp/9ccc2da0-a8dc-412d-9095-0d01c5526ef5/e70539a9-144d-4900-807e-914ae0cc8539 {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1290.787190] env[59518]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-052714a1-0b5b-43f9-a3d1-e719eb2c8244 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1290.801196] env[59518]: DEBUG 
nova.network.neutron [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] Instance cache missing network info. {{(pid=59518) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1290.804751] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59518) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 1290.805028] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.230s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1290.808894] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Created directory with path [datastore1] vmware_temp/9ccc2da0-a8dc-412d-9095-0d01c5526ef5/e70539a9-144d-4900-807e-914ae0cc8539 {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1290.809085] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] Fetch image to [datastore1] vmware_temp/9ccc2da0-a8dc-412d-9095-0d01c5526ef5/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1290.809258] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] Downloading image file data e70539a9-144d-4900-807e-914ae0cc8539 to [datastore1] vmware_temp/9ccc2da0-a8dc-412d-9095-0d01c5526ef5/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk on the data store datastore1 {{(pid=59518) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1290.810034] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f1f537c0-01c1-4730-b8a8-6b35edaec0e6 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1290.818734] env[59518]: INFO nova.scheduler.client.report [None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Deleted allocations for instance ae88d565-bbf5-4c29-aee9-364c23086de5 [ 1290.827668] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f0aa5c37-a68d-4b25-9999-a06ff8ea5f9e {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1290.837929] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c15c169f-e051-497f-a691-ac2fbe6968a6 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1290.842180] env[59518]: DEBUG oslo_concurrency.lockutils 
[None req-6e1b4319-8e20-46d0-b060-71223e445d72 tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Lock "ae88d565-bbf5-4c29-aee9-364c23086de5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 668.739s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1290.842616] env[59518]: DEBUG oslo_concurrency.lockutils [None req-49c7338e-23cb-44d3-9826-421416376e5d tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Lock "ae88d565-bbf5-4c29-aee9-364c23086de5" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 470.649s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1290.842822] env[59518]: DEBUG oslo_concurrency.lockutils [None req-49c7338e-23cb-44d3-9826-421416376e5d tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Acquiring lock "ae88d565-bbf5-4c29-aee9-364c23086de5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1290.843009] env[59518]: DEBUG oslo_concurrency.lockutils [None req-49c7338e-23cb-44d3-9826-421416376e5d tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Lock "ae88d565-bbf5-4c29-aee9-364c23086de5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1290.843160] env[59518]: DEBUG oslo_concurrency.lockutils [None req-49c7338e-23cb-44d3-9826-421416376e5d tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Lock "ae88d565-bbf5-4c29-aee9-364c23086de5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1290.845546] env[59518]: INFO nova.compute.manager [None req-49c7338e-23cb-44d3-9826-421416376e5d tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Terminating instance [ 1290.847558] env[59518]: DEBUG nova.compute.manager [None req-49c7338e-23cb-44d3-9826-421416376e5d tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Start destroying the instance on the hypervisor. 
{{(pid=59518) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1290.847753] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-49c7338e-23cb-44d3-9826-421416376e5d tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Destroying instance {{(pid=59518) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1290.847977] env[59518]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-bbefc927-8e6f-41ea-8596-07fe494d57f6 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1290.875792] env[59518]: DEBUG nova.network.neutron [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] Updating instance_info_cache with network_info: [] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1290.877697] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ad62d558-a2d2-441a-8c3f-8767607d2ec3 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1290.884070] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-84cfd02f-e3c5-4b5e-a1c3-8f70861058df {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1290.896170] env[59518]: DEBUG oslo_concurrency.lockutils [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Releasing lock "refresh_cache-7d4fa130-c399-4e8c-a711-33a08ed5dde9" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1290.896534] env[59518]: DEBUG nova.compute.manager [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] Start destroying the instance on the hypervisor. 
{{(pid=59518) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1290.896710] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] Destroying instance {{(pid=59518) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1290.897252] env[59518]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-d3973731-4f39-4141-8b2c-0851a2345b92 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1290.899506] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35210174-dd2c-416c-be89-942f8d24f914 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1290.906144] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] Unregistering the VM {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1290.912496] env[59518]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-33de3f83-4450-440d-a989-368ed5be44e7 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1290.913873] env[59518]: WARNING nova.virt.vmwareapi.vmops [None req-49c7338e-23cb-44d3-9826-421416376e5d tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance ae88d565-bbf5-4c29-aee9-364c23086de5 could not be found. [ 1290.914047] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-49c7338e-23cb-44d3-9826-421416376e5d tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Instance destroyed {{(pid=59518) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1290.914210] env[59518]: INFO nova.compute.manager [None req-49c7338e-23cb-44d3-9826-421416376e5d tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Took 0.07 seconds to destroy the instance on the hypervisor. [ 1290.914438] env[59518]: DEBUG oslo.service.loopingcall [None req-49c7338e-23cb-44d3-9826-421416376e5d tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=59518) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1290.914637] env[59518]: DEBUG nova.compute.manager [-] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Deallocating network for instance {{(pid=59518) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1290.914727] env[59518]: DEBUG nova.network.neutron [-] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] deallocate_for_instance() {{(pid=59518) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1290.921183] env[59518]: DEBUG nova.virt.vmwareapi.images [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] Downloading image file data e70539a9-144d-4900-807e-914ae0cc8539 to the data store datastore1 {{(pid=59518) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1290.935547] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] Unregistered the VM {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1290.935747] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] Deleting contents of the VM from datastore datastore1 {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1290.935913] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Deleting the datastore file [datastore1] 7d4fa130-c399-4e8c-a711-33a08ed5dde9 {{(pid=59518) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1290.936159] env[59518]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-ae8ab527-357d-4689-ba32-204f1ecb0046 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1290.938959] env[59518]: DEBUG nova.network.neutron [-] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Updating instance_info_cache with network_info: [] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1290.942178] env[59518]: DEBUG oslo_vmware.api [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Waiting for the task: (returnval){ [ 1290.942178] env[59518]: value = "task-308022" [ 1290.942178] env[59518]: _type = "Task" [ 1290.942178] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1290.949901] env[59518]: DEBUG oslo_vmware.api [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Task: {'id': task-308022, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1290.951860] env[59518]: INFO nova.compute.manager [-] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] Took 0.04 seconds to deallocate network for instance. [ 1291.020438] env[59518]: DEBUG oslo_concurrency.lockutils [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1291.021237] env[59518]: ERROR nova.compute.manager [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image e70539a9-144d-4900-807e-914ae0cc8539. [ 1291.021237] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] Traceback (most recent call last): [ 1291.021237] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1291.021237] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1291.021237] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1291.021237] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] result = getattr(controller, method)(*args, **kwargs) [ 1291.021237] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1291.021237] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] return self._get(image_id) [ 1291.021237] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1291.021237] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1291.021237] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1291.021613] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] resp, body = self.http_client.get(url, headers=header) [ 1291.021613] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1291.021613] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] return self.request(url, 'GET', **kwargs) [ 1291.021613] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1291.021613] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] return self._handle_response(resp) [ 
1291.021613] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1291.021613] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] raise exc.from_response(resp, resp.content) [ 1291.021613] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1291.021613] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] [ 1291.021613] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] During handling of the above exception, another exception occurred: [ 1291.021613] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] [ 1291.021613] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] Traceback (most recent call last): [ 1291.021947] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1291.021947] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] yield resources [ 1291.021947] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1291.021947] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] self.driver.spawn(context, instance, image_meta, [ 1291.021947] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1291.021947] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1291.021947] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1291.021947] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] self._fetch_image_if_missing(context, vi) [ 1291.021947] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1291.021947] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] image_fetch(context, vi, tmp_image_ds_loc) [ 1291.021947] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1291.021947] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] images.fetch_image( [ 1291.021947] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1291.022353] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] metadata = IMAGE_API.get(context, image_ref) [ 
1291.022353] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1291.022353] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] return session.show(context, image_id, [ 1291.022353] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1291.022353] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] _reraise_translated_image_exception(image_id) [ 1291.022353] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1291.022353] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] raise new_exc.with_traceback(exc_trace) [ 1291.022353] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1291.022353] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1291.022353] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1291.022353] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] result = getattr(controller, method)(*args, **kwargs) [ 1291.022353] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1291.022353] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] return self._get(image_id) [ 1291.022727] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1291.022727] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1291.022727] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1291.022727] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] resp, body = self.http_client.get(url, headers=header) [ 1291.022727] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1291.022727] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] return self.request(url, 'GET', **kwargs) [ 1291.022727] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1291.022727] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] return self._handle_response(resp) [ 1291.022727] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1291.022727] 
env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] raise exc.from_response(resp, resp.content) [ 1291.022727] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] nova.exception.ImageNotAuthorized: Not authorized for image e70539a9-144d-4900-807e-914ae0cc8539. [ 1291.022727] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] [ 1291.023027] env[59518]: INFO nova.compute.manager [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] Terminating instance [ 1291.023253] env[59518]: DEBUG oslo_concurrency.lockutils [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1291.023461] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1291.024063] env[59518]: DEBUG nova.compute.manager [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] Start destroying the instance on the hypervisor. {{(pid=59518) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1291.024245] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] Destroying instance {{(pid=59518) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1291.024459] env[59518]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-cccfcec5-f424-479b-a609-ae3209a01c15 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1291.027319] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-500eec46-d9ab-43b0-9129-6000227cb86a {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1291.036414] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] Unregistering the VM {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1291.036813] env[59518]: DEBUG oslo_concurrency.lockutils [None req-49c7338e-23cb-44d3-9826-421416376e5d tempest-ImagesOneServerNegativeTestJSON-865905459 tempest-ImagesOneServerNegativeTestJSON-865905459-project-member] Lock "ae88d565-bbf5-4c29-aee9-364c23086de5" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.194s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1291.038354] 
env[59518]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-ed053796-fe4b-437a-b259-f72fea72189c {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1291.039791] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "ae88d565-bbf5-4c29-aee9-364c23086de5" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 80.437s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1291.039959] env[59518]: INFO nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: ae88d565-bbf5-4c29-aee9-364c23086de5] During sync_power_state the instance has a pending task (deleting). Skip. [ 1291.040271] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "ae88d565-bbf5-4c29-aee9-364c23086de5" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1291.040688] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1291.040845] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=59518) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1291.042027] env[59518]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-66eb1779-7d72-42bb-94ac-af14e592765e {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1291.050169] env[59518]: DEBUG oslo_vmware.api [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] Waiting for the task: (returnval){ [ 1291.050169] env[59518]: value = "session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]5292dd83-e44f-55c0-d5bd-60528ba803b4" [ 1291.050169] env[59518]: _type = "Task" [ 1291.050169] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1291.056483] env[59518]: DEBUG oslo_vmware.api [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] Task: {'id': task-308020, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.067234} completed successfully. 
{{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1291.056967] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] Deleted the datastore file {{(pid=59518) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1291.057140] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] [instance: ac3485ac-4817-4492-a196-331002b2cc46] Deleted contents of the VM from datastore datastore1 {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1291.057347] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] [instance: ac3485ac-4817-4492-a196-331002b2cc46] Instance destroyed {{(pid=59518) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1291.057535] env[59518]: INFO nova.compute.manager [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] [instance: ac3485ac-4817-4492-a196-331002b2cc46] Took 0.64 seconds to destroy the instance on the hypervisor. [ 1291.061493] env[59518]: DEBUG oslo_vmware.api [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] Task: {'id': session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]5292dd83-e44f-55c0-d5bd-60528ba803b4, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1291.061888] env[59518]: DEBUG nova.compute.claims [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] [instance: ac3485ac-4817-4492-a196-331002b2cc46] Aborting claim: {{(pid=59518) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1291.062038] env[59518]: DEBUG oslo_concurrency.lockutils [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1291.062232] env[59518]: DEBUG oslo_concurrency.lockutils [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1291.084760] env[59518]: DEBUG oslo_concurrency.lockutils [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.022s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1291.085795] env[59518]: DEBUG nova.compute.utils [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] [instance: ac3485ac-4817-4492-a196-331002b2cc46] Instance ac3485ac-4817-4492-a196-331002b2cc46 could not be found. {{(pid=59518) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1291.087344] env[59518]: DEBUG nova.compute.manager [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] [instance: ac3485ac-4817-4492-a196-331002b2cc46] Instance disappeared during build. {{(pid=59518) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1291.087493] env[59518]: DEBUG nova.compute.manager [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] [instance: ac3485ac-4817-4492-a196-331002b2cc46] Unplugging VIFs for instance {{(pid=59518) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1291.087643] env[59518]: DEBUG nova.compute.manager [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59518) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1291.087799] env[59518]: DEBUG nova.compute.manager [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] [instance: ac3485ac-4817-4492-a196-331002b2cc46] Deallocating network for instance {{(pid=59518) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1291.087943] env[59518]: DEBUG nova.network.neutron [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] [instance: ac3485ac-4817-4492-a196-331002b2cc46] deallocate_for_instance() {{(pid=59518) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1291.098576] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] Unregistered the VM {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1291.098778] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] Deleting contents of the VM from datastore datastore1 {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1291.098953] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Deleting the datastore file [datastore1] 935d4358-07b0-423b-8685-26d5bafe9e2f {{(pid=59518) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1291.099260] env[59518]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-5df24603-e257-4d42-9621-93f92ba777ff {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1291.105922] env[59518]: DEBUG oslo_vmware.api [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Waiting for the task: (returnval){ [ 1291.105922] env[59518]: value = "task-308024" [ 1291.105922] env[59518]: _type = "Task" [ 1291.105922] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1291.113312] env[59518]: DEBUG oslo_vmware.api [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Task: {'id': task-308024, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1291.178008] env[59518]: DEBUG neutronclient.v2_0.client [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=59518) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1291.181829] env[59518]: ERROR nova.compute.manager [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] [instance: ac3485ac-4817-4492-a196-331002b2cc46] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. [ 1291.181829] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] Traceback (most recent call last): [ 1291.181829] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1291.181829] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1291.181829] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1291.181829] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] result = getattr(controller, method)(*args, **kwargs) [ 1291.181829] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1291.181829] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] return self._get(image_id) [ 1291.181829] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1291.181829] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1291.181829] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1291.181829] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] resp, body = self.http_client.get(url, headers=header) [ 1291.182228] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1291.182228] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] return self.request(url, 'GET', **kwargs) [ 1291.182228] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1291.182228] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] return self._handle_response(resp) [ 1291.182228] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in 
_handle_response [ 1291.182228] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] raise exc.from_response(resp, resp.content) [ 1291.182228] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1291.182228] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] [ 1291.182228] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] During handling of the above exception, another exception occurred: [ 1291.182228] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] [ 1291.182228] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] Traceback (most recent call last): [ 1291.182228] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1291.182560] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] self.driver.spawn(context, instance, image_meta, [ 1291.182560] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1291.182560] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1291.182560] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1291.182560] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] self._fetch_image_if_missing(context, vi) [ 1291.182560] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1291.182560] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] image_fetch(context, vi, tmp_image_ds_loc) [ 1291.182560] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1291.182560] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] images.fetch_image( [ 1291.182560] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1291.182560] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] metadata = IMAGE_API.get(context, image_ref) [ 1291.182560] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1291.182560] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] return session.show(context, image_id, [ 1291.182972] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1291.182972] 
env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] _reraise_translated_image_exception(image_id) [ 1291.182972] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1291.182972] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] raise new_exc.with_traceback(exc_trace) [ 1291.182972] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1291.182972] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1291.182972] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1291.182972] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] result = getattr(controller, method)(*args, **kwargs) [ 1291.182972] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1291.182972] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] return self._get(image_id) [ 1291.182972] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1291.182972] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1291.182972] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1291.183438] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] resp, body = self.http_client.get(url, headers=header) [ 1291.183438] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1291.183438] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] return self.request(url, 'GET', **kwargs) [ 1291.183438] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1291.183438] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] return self._handle_response(resp) [ 1291.183438] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1291.183438] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] raise exc.from_response(resp, resp.content) [ 1291.183438] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] nova.exception.ImageNotAuthorized: Not authorized for image e70539a9-144d-4900-807e-914ae0cc8539. 
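[editor's note] The tracebacks above all follow the same shape: glanceclient raises HTTPUnauthorized (HTTP 401) on the image GET, and nova's glance wrapper re-raises it as nova.exception.ImageNotAuthorized while keeping the original traceback via `raise new_exc.with_traceback(exc_trace)` (visible in the `_reraise_translated_image_exception` frames). A minimal self-contained sketch of that translation pattern follows; the exception classes here are stand-ins, not the real glanceclient/nova definitions.

```python
# Sketch of the exception-translation pattern seen in the tracebacks above:
# a client-level HTTP error is re-raised as a service-level exception while
# the original traceback is preserved with Exception.with_traceback().
import sys


class HTTPUnauthorized(Exception):
    """Stand-in for glanceclient.exc.HTTPUnauthorized (HTTP 401)."""


class ImageNotAuthorized(Exception):
    """Stand-in for nova.exception.ImageNotAuthorized."""

    def __init__(self, image_id):
        super().__init__(f"Not authorized for image {image_id}.")


def _reraise_translated_image_exception(image_id):
    """Translate the in-flight client exception, keeping its traceback."""
    _exc_type, exc_value, exc_trace = sys.exc_info()
    if isinstance(exc_value, HTTPUnauthorized):
        new_exc = ImageNotAuthorized(image_id)
        raise new_exc.with_traceback(exc_trace)
    raise  # anything unrecognized propagates unchanged


def show(image_id):
    try:
        # Simulate the failing image GET performed by the glance client.
        raise HTTPUnauthorized("HTTP 401 Unauthorized")
    except Exception:
        _reraise_translated_image_exception(image_id)


if __name__ == "__main__":
    try:
        show("e70539a9-144d-4900-807e-914ae0cc8539")  # image UUID from this log
    except ImageNotAuthorized as exc:
        print(type(exc).__name__, exc)
```

The payoff of `with_traceback()` is exactly what the log shows: the re-raised ImageNotAuthorized still carries every glanceclient/keystoneauth frame, so the 401's true origin stays visible.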
[ 1291.183438] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] [ 1291.183438] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] During handling of the above exception, another exception occurred: [ 1291.183438] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] [ 1291.183438] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] Traceback (most recent call last): [ 1291.183438] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1291.183784] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] self._build_and_run_instance(context, instance, image, [ 1291.183784] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1291.183784] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] with excutils.save_and_reraise_exception(): [ 1291.183784] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1291.183784] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] self.force_reraise() [ 1291.183784] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1291.183784] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] raise self.value [ 1291.183784] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1291.183784] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] with self.rt.instance_claim(context, instance, node, allocs, [ 1291.183784] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1291.183784] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] self.abort() [ 1291.183784] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/opt/stack/nova/nova/compute/claims.py", line 85, in abort [ 1291.183784] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] self.tracker.abort_instance_claim(self.context, self.instance, [ 1291.184162] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1291.184162] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] return f(*args, **kwargs) [ 1291.184162] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1291.184162] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] self._unset_instance_host_and_node(instance) [ 1291.184162] env[59518]: ERROR nova.compute.manager [instance: 
ac3485ac-4817-4492-a196-331002b2cc46] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1291.184162] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] instance.save() [ 1291.184162] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1291.184162] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] updates, result = self.indirection_api.object_action( [ 1291.184162] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1291.184162] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] return cctxt.call(context, 'object_action', objinst=objinst, [ 1291.184162] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1291.184162] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] result = self.transport._send( [ 1291.184518] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1291.184518] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] return self._driver.send(target, ctxt, message, [ 1291.184518] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1291.184518] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1291.184518] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1291.184518] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] raise result [ 1291.184518] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] nova.exception_Remote.InstanceNotFound_Remote: Instance ac3485ac-4817-4492-a196-331002b2cc46 could not be found. 
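The "During handling of the above exception, another exception occurred" layering here comes from cleanup failing inside a context manager: the resource claim's __exit__ aborts the claim, abort() calls instance.save(), and that save fails over RPC because the instance row is already gone. A small sketch of the chaining behavior, with simplified stand-ins for Claim, ImageNotAuthorized, and InstanceNotFound:

```python
# Why the build failure (ImageNotAuthorized) ends up wrapped by a cleanup
# failure (InstanceNotFound): an exception raised in __exit__ while another
# is in flight gets the original attached as implicit context.

class InstanceNotFound(Exception):
    pass

class ImageNotAuthorized(Exception):
    pass

class Claim:
    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        if exc_type is not None:
            self.abort()   # cleanup path: may raise its own exception
        return False       # never swallow the original error

    def abort(self):
        # Corresponds to abort_instance_claim -> instance.save() failing
        # because the DB row no longer exists.
        raise InstanceNotFound("instance could not be found")

try:
    with Claim():
        raise ImageNotAuthorized("Not authorized for image")
except InstanceNotFound as exc:
    # The original failure is still attached as implicit context.
    assert isinstance(exc.__context__, ImageNotAuthorized)
```

The instance was deleted while the build was still running, so both the claim abort and the later network cleanup operate on a record that no longer exists.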
[ 1291.184518] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] Traceback (most recent call last): [ 1291.184518] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] [ 1291.184518] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1291.184518] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] return getattr(target, method)(*args, **kwargs) [ 1291.184518] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] [ 1291.184518] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1291.184890] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] return fn(self, *args, **kwargs) [ 1291.184890] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] [ 1291.184890] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1291.184890] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] old_ref, inst_ref = db.instance_update_and_get_original( [ 1291.184890] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] [ 1291.184890] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1291.184890] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] return f(*args, **kwargs) [ 1291.184890] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] [ 1291.184890] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1291.184890] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] with excutils.save_and_reraise_exception() as ectxt: [ 1291.184890] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] [ 1291.184890] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1291.184890] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] self.force_reraise() [ 1291.184890] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] [ 1291.184890] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1291.185332] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] raise self.value [ 1291.185332] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] [ 1291.185332] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1291.185332] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] return f(*args, 
**kwargs) [ 1291.185332] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] [ 1291.185332] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1291.185332] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] return f(context, *args, **kwargs) [ 1291.185332] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] [ 1291.185332] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1291.185332] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1291.185332] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] [ 1291.185332] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1291.185332] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] raise exception.InstanceNotFound(instance_id=uuid) [ 1291.185332] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] [ 1291.185332] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] nova.exception.InstanceNotFound: Instance ac3485ac-4817-4492-a196-331002b2cc46 could not be found. [ 1291.185756] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] [ 1291.185756] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] [ 1291.185756] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] During handling of the above exception, another exception occurred: [ 1291.185756] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] [ 1291.185756] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] Traceback (most recent call last): [ 1291.185756] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1291.185756] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] ret = obj(*args, **kwargs) [ 1291.185756] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1291.185756] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] exception_handler_v20(status_code, error_body) [ 1291.185756] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1291.185756] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] raise client_exc(message=error_message, [ 1291.185756] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1291.185756] 
env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] Neutron server returns request_ids: ['req-8837f4f4-3300-4c15-9a6c-4708bde0a91e'] [ 1291.185756] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] [ 1291.186143] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] During handling of the above exception, another exception occurred: [ 1291.186143] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] [ 1291.186143] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] Traceback (most recent call last): [ 1291.186143] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1291.186143] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] self._deallocate_network(context, instance, requested_networks) [ 1291.186143] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1291.186143] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] self.network_api.deallocate_for_instance( [ 1291.186143] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance [ 1291.186143] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] data = neutron.list_ports(**search_opts) [ 1291.186143] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1291.186143] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] ret = obj(*args, **kwargs) [ 1291.186143] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1291.186143] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] return self.list('ports', self.ports_path, retrieve_all, [ 1291.186520] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1291.186520] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] ret = obj(*args, **kwargs) [ 1291.186520] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1291.186520] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] for r in self._pagination(collection, path, **params): [ 1291.186520] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1291.186520] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] res = self.get(path, params=params) [ 1291.186520] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1291.186520] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] ret = obj(*args, **kwargs) [ 1291.186520] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1291.186520] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] return self.retry_request("GET", action, body=body, [ 1291.186520] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1291.186520] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] ret = obj(*args, **kwargs) [ 1291.186520] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1291.186871] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] return self.do_request(method, action, body=body, [ 1291.186871] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1291.186871] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] ret = obj(*args, **kwargs) [ 1291.186871] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1291.186871] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] self._handle_fault_response(status_code, replybody, resp) [ 1291.186871] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1291.186871] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] raise exception.Unauthorized() [ 1291.186871] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] nova.exception.Unauthorized: Not authorized. [ 1291.186871] env[59518]: ERROR nova.compute.manager [instance: ac3485ac-4817-4492-a196-331002b2cc46] [ 1291.208322] env[59518]: DEBUG oslo_concurrency.lockutils [None req-7bc749b0-2118-432a-baa5-6486fcfcbff0 tempest-ServersNegativeTestJSON-59830360 tempest-ServersNegativeTestJSON-59830360-project-member] Lock "ac3485ac-4817-4492-a196-331002b2cc46" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 606.676s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1291.454048] env[59518]: DEBUG oslo_vmware.api [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Task: {'id': task-308022, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.031011} completed successfully.
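Every frame labeled "neutron.py, line 196, in wrapper" in the traceback above is one pass through the proxying decorator that nova.network.neutron wraps around each neutronclient method, mapping the client's 401 into a Nova exception. A simplified sketch of that pattern (names here are illustrative, not Nova's actual code):

```python
import functools

class NeutronUnauthorized(Exception):
    """Stand-in for neutronclient.common.exceptions.Unauthorized."""

class NovaUnauthorized(Exception):
    """Stand-in for nova.exception.Unauthorized."""

def translate_client_errors(func):
    # One wrapper per proxied client method; nested proxied calls
    # (list -> _pagination -> get -> retry_request -> do_request) explain the
    # repeated wrapper frames in the log.
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        try:
            return func(*args, **kwargs)
        except NeutronUnauthorized:
            raise NovaUnauthorized("Not authorized.")
    return wrapper

@translate_client_errors
def list_ports(**search_opts):
    raise NeutronUnauthorized(
        "401: The request you have made requires authentication.")
```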
{{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1291.454426] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Deleted the datastore file {{(pid=59518) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1291.454600] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] Deleted contents of the VM from datastore datastore1 {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1291.454848] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] Instance destroyed {{(pid=59518) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1291.455080] env[59518]: INFO nova.compute.manager [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] Took 0.56 seconds to destroy the instance on the hypervisor. [ 1291.455369] env[59518]: DEBUG oslo.service.loopingcall [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=59518) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1291.455615] env[59518]: DEBUG nova.compute.manager [-] [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] Deallocating network for instance {{(pid=59518) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1291.455762] env[59518]: DEBUG nova.network.neutron [-] [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] deallocate_for_instance() {{(pid=59518) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1291.543339] env[59518]: DEBUG neutronclient.v2_0.client [-] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=59518) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1291.543607] env[59518]: ERROR nova.network.neutron [-] Neutron client was not able to generate a valid admin token, please verify Neutron admin credential located in nova.conf: neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1291.544314] env[59518]: ERROR oslo.service.loopingcall [-] Dynamic interval looping call 'oslo_service.loopingcall.RetryDecorator.__call__.<locals>._func' failed: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception.
[ 1291.544314] env[59518]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1291.544314] env[59518]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1291.544314] env[59518]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1291.544314] env[59518]: ERROR oslo.service.loopingcall File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1291.544314] env[59518]: ERROR oslo.service.loopingcall exception_handler_v20(status_code, error_body) [ 1291.544314] env[59518]: ERROR oslo.service.loopingcall File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1291.544314] env[59518]: ERROR oslo.service.loopingcall raise client_exc(message=error_message, [ 1291.544314] env[59518]: ERROR oslo.service.loopingcall neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1291.544314] env[59518]: ERROR oslo.service.loopingcall Neutron server returns request_ids: ['req-d9264b30-ef8d-4e2a-bdf3-0f382c44df96'] [ 1291.544314] env[59518]: ERROR oslo.service.loopingcall [ 1291.544314] env[59518]: ERROR oslo.service.loopingcall During handling of the above exception, another exception occurred: [ 1291.544314] env[59518]: ERROR oslo.service.loopingcall [ 1291.544314] env[59518]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1291.544314] env[59518]: ERROR oslo.service.loopingcall File "/usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1291.544314] env[59518]: ERROR oslo.service.loopingcall result = func(*self.args, **self.kw) [ 1291.545001] env[59518]: ERROR oslo.service.loopingcall File "/usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py", line 407, in _func [ 1291.545001] env[59518]: ERROR oslo.service.loopingcall result = f(*args, **kwargs) [ 1291.545001] env[59518]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 3041, in _deallocate_network_with_retries [ 1291.545001] env[59518]: ERROR oslo.service.loopingcall self._deallocate_network( [ 1291.545001] env[59518]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1291.545001] env[59518]: ERROR oslo.service.loopingcall self.network_api.deallocate_for_instance( [ 1291.545001] env[59518]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance [ 1291.545001] env[59518]: ERROR oslo.service.loopingcall data = neutron.list_ports(**search_opts) [ 1291.545001] env[59518]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1291.545001] env[59518]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1291.545001] env[59518]: ERROR oslo.service.loopingcall File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1291.545001] env[59518]: ERROR oslo.service.loopingcall return self.list('ports', self.ports_path, retrieve_all, [ 1291.545001] env[59518]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1291.545001] env[59518]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1291.545001] env[59518]: ERROR oslo.service.loopingcall File 
"/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1291.545001] env[59518]: ERROR oslo.service.loopingcall for r in self._pagination(collection, path, **params): [ 1291.545001] env[59518]: ERROR oslo.service.loopingcall File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1291.545001] env[59518]: ERROR oslo.service.loopingcall res = self.get(path, params=params) [ 1291.545757] env[59518]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1291.545757] env[59518]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1291.545757] env[59518]: ERROR oslo.service.loopingcall File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1291.545757] env[59518]: ERROR oslo.service.loopingcall return self.retry_request("GET", action, body=body, [ 1291.545757] env[59518]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1291.545757] env[59518]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1291.545757] env[59518]: ERROR oslo.service.loopingcall File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1291.545757] env[59518]: ERROR oslo.service.loopingcall return self.do_request(method, action, body=body, [ 1291.545757] env[59518]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1291.545757] env[59518]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1291.545757] env[59518]: ERROR oslo.service.loopingcall File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1291.545757] env[59518]: ERROR oslo.service.loopingcall self._handle_fault_response(status_code, replybody, resp) [ 1291.545757] env[59518]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1291.545757] env[59518]: ERROR oslo.service.loopingcall raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1291.545757] env[59518]: ERROR oslo.service.loopingcall nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1291.545757] env[59518]: ERROR oslo.service.loopingcall [ 1291.546422] env[59518]: ERROR nova.compute.manager [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] Failed to deallocate network for instance. Error: Networking client is experiencing an unauthorized exception.: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1291.560949] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] [instance: 3981aa30-0515-4764-9aac-d0c99a48b064] Preparing fetch location {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1291.561238] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] Creating directory with path [datastore1] vmware_temp/fe6f00d8-ec90-400c-9fb8-b1cc9ce2d177/e70539a9-144d-4900-807e-914ae0cc8539 {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1291.561461] env[59518]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-7236f9ee-7cf7-4258-8aba-f0d418f6922c {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1291.569455] env[59518]: DEBUG nova.compute.manager [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] Instance has been destroyed from under us while trying to set it to ERROR {{(pid=59518) _set_instance_obj_error_state /opt/stack/nova/nova/compute/manager.py:728}} [ 1291.569705] env[59518]: WARNING nova.compute.manager [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Could not clean up failed build, not rescheduling. Error: Networking client is experiencing an unauthorized exception.: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
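The "Dynamic interval looping call ... failed" entries come from the retry machinery named in the traceback, oslo_service.loopingcall.RetryDecorator, which re-invokes the deallocation until it succeeds or hits a non-retryable exception. A sketch of how that decorator behaves (the parameter values here are illustrative, not Nova's actual settings):

```python
from oslo_service import loopingcall

class TransientError(Exception):
    pass

# Illustrative parameters only. Retries happen solely for exceptions listed in
# the `exceptions` tuple; anything else (like the credential error above)
# propagates immediately and the looping call logs a failure instead.
@loopingcall.RetryDecorator(max_retry_count=3, inc_sleep_time=1,
                            max_sleep_time=5, exceptions=(TransientError,))
def deallocate():
    raise TransientError("still failing")
```

That is why the log shows a single failed looping call rather than repeated retries: NeutronAdminCredentialConfigurationInvalid is not a retryable transient error.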
[ 1291.569904] env[59518]: DEBUG nova.compute.claims [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] Aborting claim: {{(pid=59518) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1291.570050] env[59518]: DEBUG oslo_concurrency.lockutils [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1291.570243] env[59518]: DEBUG oslo_concurrency.lockutils [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1291.574024] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] Created directory with path [datastore1] vmware_temp/fe6f00d8-ec90-400c-9fb8-b1cc9ce2d177/e70539a9-144d-4900-807e-914ae0cc8539 {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1291.574186] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] [instance: 3981aa30-0515-4764-9aac-d0c99a48b064] Fetch image to [datastore1] vmware_temp/fe6f00d8-ec90-400c-9fb8-b1cc9ce2d177/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1291.574340] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] [instance: 3981aa30-0515-4764-9aac-d0c99a48b064] Downloading image file data e70539a9-144d-4900-807e-914ae0cc8539 to [datastore1] vmware_temp/fe6f00d8-ec90-400c-9fb8-b1cc9ce2d177/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk on the data store datastore1 {{(pid=59518) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1291.575177] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aebbb735-43c6-4ce8-b282-f3b68fc363a7 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1291.581860] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d8ffd721-59c2-4ca4-a1b9-9461a82fa665 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1291.591587] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a6d266c-2d48-43c2-8542-d79605ac9590 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1291.595874] env[59518]: DEBUG oslo_concurrency.lockutils [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc 
tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.026s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1291.596549] env[59518]: DEBUG nova.compute.utils [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] Instance 7d4fa130-c399-4e8c-a711-33a08ed5dde9 could not be found. {{(pid=59518) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1291.598136] env[59518]: DEBUG nova.compute.manager [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] Instance disappeared during build. {{(pid=59518) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1291.598322] env[59518]: DEBUG nova.compute.manager [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] Unplugging VIFs for instance {{(pid=59518) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1291.598530] env[59518]: DEBUG oslo_concurrency.lockutils [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Acquiring lock "refresh_cache-7d4fa130-c399-4e8c-a711-33a08ed5dde9" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1291.598664] env[59518]: DEBUG oslo_concurrency.lockutils [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Acquired lock "refresh_cache-7d4fa130-c399-4e8c-a711-33a08ed5dde9" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1291.598872] env[59518]: DEBUG nova.network.neutron [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] Building network info cache for instance {{(pid=59518) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1291.623931] env[59518]: DEBUG nova.compute.utils [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] Can not refresh info_cache because instance was not found {{(pid=59518) refresh_info_cache_for_instance /opt/stack/nova/nova/compute/utils.py:1010}} [ 1291.628631] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cdc5be68-fbfb-47a2-9960-fcab9f280e64 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1291.635990] env[59518]: DEBUG oslo_vmware.api [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Task: {'id': task-308024, 'name': DeleteDatastoreFile_Task, 
'duration_secs': 0.076668} completed successfully. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1291.637401] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Deleted the datastore file {{(pid=59518) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1291.637583] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] Deleted contents of the VM from datastore datastore1 {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1291.637747] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] Instance destroyed {{(pid=59518) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1291.637906] env[59518]: INFO nova.compute.manager [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] Took 0.61 seconds to destroy the instance on the hypervisor. [ 1291.639450] env[59518]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-2ed49dbf-9b37-440a-b45c-692ee818ba28 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1291.641224] env[59518]: DEBUG nova.compute.claims [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] Aborting claim: {{(pid=59518) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1291.641394] env[59518]: DEBUG oslo_concurrency.lockutils [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1291.641590] env[59518]: DEBUG oslo_concurrency.lockutils [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1291.645636] env[59518]: DEBUG nova.network.neutron [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] Instance cache missing network info. 
{{(pid=59518) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1291.662418] env[59518]: DEBUG nova.virt.vmwareapi.images [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] [instance: 3981aa30-0515-4764-9aac-d0c99a48b064] Downloading image file data e70539a9-144d-4900-807e-914ae0cc8539 to the data store datastore1 {{(pid=59518) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1291.665652] env[59518]: DEBUG oslo_concurrency.lockutils [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.024s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1291.666654] env[59518]: DEBUG nova.compute.utils [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] Instance 935d4358-07b0-423b-8685-26d5bafe9e2f could not be found. {{(pid=59518) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1291.668019] env[59518]: DEBUG nova.compute.manager [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] Instance disappeared during build. {{(pid=59518) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1291.668242] env[59518]: DEBUG nova.compute.manager [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] Unplugging VIFs for instance {{(pid=59518) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1291.668408] env[59518]: DEBUG nova.compute.manager [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59518) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1291.668555] env[59518]: DEBUG nova.compute.manager [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] Deallocating network for instance {{(pid=59518) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1291.668707] env[59518]: DEBUG nova.network.neutron [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] deallocate_for_instance() {{(pid=59518) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1291.698237] env[59518]: DEBUG nova.network.neutron [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] Updating instance_info_cache with network_info: [] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1291.706136] env[59518]: DEBUG neutronclient.v2_0.client [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=59518) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1291.707701] env[59518]: ERROR nova.compute.manager [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
[ 1291.707701] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] Traceback (most recent call last): [ 1291.707701] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1291.707701] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1291.707701] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1291.707701] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] result = getattr(controller, method)(*args, **kwargs) [ 1291.707701] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1291.707701] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] return self._get(image_id) [ 1291.707701] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1291.707701] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1291.707701] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1291.707701] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] resp, body = self.http_client.get(url, headers=header) [ 1291.708134] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1291.708134] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] return self.request(url, 'GET', **kwargs) [ 1291.708134] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1291.708134] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] return self._handle_response(resp) [ 1291.708134] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1291.708134] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] raise exc.from_response(resp, resp.content) [ 1291.708134] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1291.708134] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] [ 1291.708134] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] During handling of the above exception, another exception occurred: [ 1291.708134] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] [ 1291.708134] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] Traceback (most recent call last): [ 1291.708134] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1291.708469] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] self.driver.spawn(context, instance, image_meta, [ 1291.708469] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1291.708469] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1291.708469] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1291.708469] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] self._fetch_image_if_missing(context, vi) [ 1291.708469] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1291.708469] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] image_fetch(context, vi, tmp_image_ds_loc) [ 1291.708469] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1291.708469] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] images.fetch_image( [ 1291.708469] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1291.708469] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] metadata = IMAGE_API.get(context, image_ref) [ 1291.708469] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1291.708469] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] return session.show(context, image_id, [ 1291.708820] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1291.708820] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] _reraise_translated_image_exception(image_id) [ 1291.708820] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1291.708820] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] raise new_exc.with_traceback(exc_trace) [ 1291.708820] env[59518]: ERROR nova.compute.manager [instance: 
935d4358-07b0-423b-8685-26d5bafe9e2f] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1291.708820] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1291.708820] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1291.708820] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] result = getattr(controller, method)(*args, **kwargs) [ 1291.708820] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1291.708820] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] return self._get(image_id) [ 1291.708820] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1291.708820] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1291.708820] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1291.709165] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] resp, body = self.http_client.get(url, headers=header) [ 1291.709165] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1291.709165] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] return self.request(url, 'GET', **kwargs) [ 1291.709165] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1291.709165] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] return self._handle_response(resp) [ 1291.709165] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1291.709165] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] raise exc.from_response(resp, resp.content) [ 1291.709165] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f] nova.exception.ImageNotAuthorized: Not authorized for image e70539a9-144d-4900-807e-914ae0cc8539. 
[ 1291.709165] env[59518]: ERROR nova.compute.manager [instance: 935d4358-07b0-423b-8685-26d5bafe9e2f]
During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance
    self._build_and_run_instance(context, instance, image,
  File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance
    with excutils.save_and_reraise_exception():
  File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
    self.force_reraise()
  File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
    raise self.value
  File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance
    with self.rt.instance_claim(context, instance, node, allocs,
  File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__
    self.abort()
  File "/opt/stack/nova/nova/compute/claims.py", line 85, in abort
    self.tracker.abort_instance_claim(self.context, self.instance,
  File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner
    return f(*args, **kwargs)
  File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim
    self._unset_instance_host_and_node(instance)
  File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node
    instance.save()
  File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper
    updates, result = self.indirection_api.object_action(
  File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action
    return cctxt.call(context, 'object_action', objinst=objinst,
  File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
    result = self.transport._send(
  File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
    return self._driver.send(target, ctxt, message,
  File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
    return self._send(target, ctxt, message, wait_for_reply, timeout,
  File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
    raise result
nova.exception_Remote.InstanceNotFound_Remote: Instance 935d4358-07b0-423b-8685-26d5bafe9e2f could not be found.
Traceback (most recent call last):
  File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch
    return getattr(target, method)(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper
    return fn(self, *args, **kwargs)
  File "/opt/stack/nova/nova/objects/instance.py", line 838, in save
    old_ref, inst_ref = db.instance_update_and_get_original(
  File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper
    return f(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper
    with excutils.save_and_reraise_exception() as ectxt:
  File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
    self.force_reraise()
  File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
    raise self.value
  File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper
    return f(*args, **kwargs)
  File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper
    return f(context, *args, **kwargs)
  File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original
    instance_ref = _instance_get_by_uuid(context, instance_uuid,
  File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid
    raise exception.InstanceNotFound(instance_id=uuid)
nova.exception.InstanceNotFound: Instance 935d4358-07b0-423b-8685-26d5bafe9e2f could not be found.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
    ret = obj(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response
    exception_handler_v20(status_code, error_body)
  File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20
    raise client_exc(message=error_message,
neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}}
Neutron server returns request_ids: ['req-9fd8928f-acb1-44d2-826d-769b79bed2c7']

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks
    self._deallocate_network(context, instance, requested_networks)
  File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network
    self.network_api.deallocate_for_instance(
  File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance
    data = neutron.list_ports(**search_opts)
  File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
    ret = obj(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports
    return self.list('ports', self.ports_path, retrieve_all,
  File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
    ret = obj(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list
    for r in self._pagination(collection, path, **params):
  File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination
    res = self.get(path, params=params)
  File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
    ret = obj(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get
    return self.retry_request("GET", action, body=body,
  File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
    ret = obj(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request
    return self.do_request(method, action, body=body,
  File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
    ret = obj(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request
    self._handle_fault_response(status_code, replybody, resp)
  File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper
    raise exception.Unauthorized()
nova.exception.Unauthorized: Not authorized.
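Annotation: this failure has three layers. The instance claim was aborted, the abort path tried to persist the unset host/node via instance.save(), the conductor answered the RPC with InstanceNotFound (re-raised client-side by oslo.messaging as the dynamically generated InstanceNotFound_Remote, with the serialized remote traceback appended), and the network cleanup that followed hit a Keystone 401 from Neutron. The repeated "During handling of the above exception, another exception occurred" markers are Python's implicit exception chaining; combined with oslo.utils' save_and_reraise_exception (visible in the frames above) they keep the original error visible while cleanup runs. A minimal, self-contained sketch of that pattern, assuming only that oslo.utils is installed; risky() and cleanup() are hypothetical stand-ins, not Nova code:

    # Sketch of the chained-exception pattern in the traceback above.
    # save_and_reraise_exception is the real oslo.utils helper; the
    # risky()/cleanup() names are illustrative only.
    from oslo_utils import excutils

    def risky():
        raise RuntimeError("primary failure")      # the original error

    def cleanup():
        raise ValueError("cleanup failed too")     # fails during unwinding

    def build():
        try:
            risky()
        except Exception:
            # If the body completes, __exit__ calls force_reraise(), i.e.
            # "raise self.value" as in the frames above. If the body itself
            # raises, Python chains the new exception onto the old one,
            # producing the "During handling of the above exception" blocks.
            with excutils.save_and_reraise_exception():
                cleanup()

    if __name__ == "__main__":
        build()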
[ 1291.712495] env[59518]: DEBUG oslo_concurrency.lockutils [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Releasing lock "refresh_cache-7d4fa130-c399-4e8c-a711-33a08ed5dde9" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 1291.712767] env[59518]: DEBUG nova.compute.manager [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59518) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 1291.712767] env[59518]: DEBUG nova.compute.manager [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] Deallocating network for instance {{(pid=59518) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 1291.712767] env[59518]: DEBUG nova.network.neutron [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] deallocate_for_instance() {{(pid=59518) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}}
[ 1291.728937] env[59518]: DEBUG oslo_concurrency.lockutils [None req-1e35efbb-6bfa-4445-8dc0-b9b2c4fa2332 tempest-MigrationsAdminTest-462693303 tempest-MigrationsAdminTest-462693303-project-member] Lock "935d4358-07b0-423b-8685-26d5bafe9e2f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 487.075s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1291.759577] env[59518]: DEBUG oslo_concurrency.lockutils [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 1291.760264] env[59518]: ERROR nova.compute.manager [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] [instance: 3981aa30-0515-4764-9aac-d0c99a48b064] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image e70539a9-144d-4900-807e-914ae0cc8539.
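Annotation: the acquire/release bookkeeping above ("waited 0.001s", "held 487.075s") comes from oslo.concurrency's named locks; the build serializes on the instance UUID, and the VMware image cache serializes on the datastore path of the cached VMDK. A minimal sketch of both forms, assuming only that oslo.concurrency is installed; the lock names are copied from the log and the function bodies are placeholders:

    # Sketch of the oslo.concurrency named-lock usage behind the
    # acquire/release lines above.
    from oslo_concurrency import lockutils

    @lockutils.synchronized('935d4358-07b0-423b-8685-26d5bafe9e2f')
    def _locked_do_build_and_run_instance():
        # Serialized per instance UUID; lockutils logs how long the
        # caller waited for the lock and how long it was held.
        pass

    def work_on_cached_image():
        # Context-manager form, as used for the image-cache VMDK path.
        with lockutils.lock('[datastore1] devstack-image-cache_base/'
                            'e70539a9-144d-4900-807e-914ae0cc8539/'
                            'e70539a9-144d-4900-807e-914ae0cc8539.vmdk'):
            pass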
[ 1291.760264] env[59518]: ERROR nova.compute.manager [instance: 3981aa30-0515-4764-9aac-d0c99a48b064]
Traceback (most recent call last):
  File "/opt/stack/nova/nova/image/glance.py", line 285, in show
    image = self._client.call(context, 2, 'get', args=(image_id,))
  File "/opt/stack/nova/nova/image/glance.py", line 191, in call
    result = getattr(controller, method)(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
    return self._get(image_id)
  File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
    return RequestIdProxy(wrapped(*args, **kwargs))
  File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
    resp, body = self.http_client.get(url, headers=header)
  File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
    return self.request(url, 'GET', **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
    return self._handle_response(resp)
  File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
    raise exc.from_response(resp, resp.content)
glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
    yield resources
  File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
    self.driver.spawn(context, instance, image_meta,
  File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
    self._vmops.spawn(context, instance, image_meta, injected_files,
  File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
    self._fetch_image_if_missing(context, vi)
  File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing
    image_fetch(context, vi, tmp_image_ds_loc)
  File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file
    images.fetch_image(
  File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image
    metadata = IMAGE_API.get(context, image_ref)
  File "/opt/stack/nova/nova/image/glance.py", line 1205, in get
    return session.show(context, image_id,
  File "/opt/stack/nova/nova/image/glance.py", line 287, in show
    _reraise_translated_image_exception(image_id)
  File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception
    raise new_exc.with_traceback(exc_trace)
  File "/opt/stack/nova/nova/image/glance.py", line 285, in show
    image = self._client.call(context, 2, 'get', args=(image_id,))
  File "/opt/stack/nova/nova/image/glance.py", line 191, in call
    result = getattr(controller, method)(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
    return self._get(image_id)
  File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
    return RequestIdProxy(wrapped(*args, **kwargs))
  File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
    resp, body = self.http_client.get(url, headers=header)
  File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
    return self.request(url, 'GET', **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
    return self._handle_response(resp)
  File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
    raise exc.from_response(resp, resp.content)
nova.exception.ImageNotAuthorized: Not authorized for image e70539a9-144d-4900-807e-914ae0cc8539.
[ 1291.762090] env[59518]: ERROR nova.compute.manager [instance: 3981aa30-0515-4764-9aac-d0c99a48b064]
[ 1291.762436] env[59518]: INFO nova.compute.manager [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] [instance: 3981aa30-0515-4764-9aac-d0c99a48b064] Terminating instance
[ 1291.762436] env[59518]: DEBUG oslo_concurrency.lockutils [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 1291.762436] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1291.762436] env[59518]: DEBUG nova.compute.manager [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] [instance: 3981aa30-0515-4764-9aac-d0c99a48b064] Start destroying the instance on the hypervisor. {{(pid=59518) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 1291.762560] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] [instance: 3981aa30-0515-4764-9aac-d0c99a48b064] Destroying instance {{(pid=59518) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1291.764073] env[59518]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c57f9772-9a94-4ac9-a9c2-7d8bfb646eb1 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1291.766673] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-85d7c279-d0ba-47f7-b034-8a3cce588266 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1291.774043] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] [instance: 3981aa30-0515-4764-9aac-d0c99a48b064] Unregistering the VM {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 1291.774180] env[59518]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-33ed7682-8cf6-4aef-af0b-7cb47e0c3fd3 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1291.776313] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1291.776740] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=59518) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 1291.777354] env[59518]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-95c270c0-5d46-4e4e-a1aa-8668435f105d {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1291.781944] env[59518]: DEBUG oslo_vmware.api [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Waiting for the task: (returnval){ value = "session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]52dd35fc-8654-6c61-02a1-e174d6231587" _type = "Task" } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1291.789591] env[59518]: DEBUG oslo_vmware.api [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Task: {'id': session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]52dd35fc-8654-6c61-02a1-e174d6231587, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1291.828603] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] [instance: 3981aa30-0515-4764-9aac-d0c99a48b064] Unregistered the VM {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 1291.828800] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] [instance: 3981aa30-0515-4764-9aac-d0c99a48b064] Deleting contents of the VM from datastore datastore1 {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 1291.829049] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] Deleting the datastore file [datastore1] 3981aa30-0515-4764-9aac-d0c99a48b064 {{(pid=59518) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 1291.829308] env[59518]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-4bed380d-9ea4-4907-9098-0102a19af818 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1291.835721] env[59518]: DEBUG oslo_vmware.api [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] Waiting for the task: (returnval){ value = "task-308026" _type = "Task" } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
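Annotation: the "Waiting for the task ... to complete" / "progress is N%" pairs above are oslo.vmware's task polling: every mutating vCenter call returns a Task managed object that the session polls until it reaches a terminal state (the api.py:397 and api.py:434 suffixes above). A rough sketch of the invoke/wait pattern, assuming a reachable vCenter and valid credentials; the host name is taken from this log, while the file path, credentials, and datacenter reference are illustrative placeholders:

    # Rough sketch of the oslo.vmware invoke/wait pattern seen above.
    from oslo_vmware import api

    session = api.VMwareAPISession(
        'vc1.osci.c.eu-de-1.cloud.sap', 'user', 'secret',
        api_retry_count=10, task_poll_interval=0.5)

    content = session.vim.service_content
    dc_ref = None  # a real Datacenter moref would be looked up first
    task = session.invoke_api(
        session.vim, 'DeleteDatastoreFile_Task', content.fileManager,
        name='[datastore1] 3981aa30-0515-4764-9aac-d0c99a48b064',
        datacenter=dc_ref)
    # Polls the task object and logs "Task: {...} progress is N%" until it
    # succeeds or errors -- the same loop as api.py:397/434 above.
    session.wait_for_task(task)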
[ 1291.838826] env[59518]: DEBUG neutronclient.v2_0.client [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=59518) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}}
[ 1291.840611] env[59518]: ERROR nova.compute.manager [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized.
[ 1291.840611] env[59518]: ERROR nova.compute.manager [instance: 7d4fa130-c399-4e8c-a711-33a08ed5dde9]
Traceback (most recent call last):
  File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
    ret = obj(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response
    exception_handler_v20(status_code, error_body)
  File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20
    raise client_exc(message=error_message,
neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}}
Neutron server returns request_ids: ['req-d9264b30-ef8d-4e2a-bdf3-0f382c44df96']

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/stack/nova/nova/compute/manager.py", line 2881, in _build_resources
    self._shutdown_instance(context, instance,
  File "/opt/stack/nova/nova/compute/manager.py", line 3140, in _shutdown_instance
    self._try_deallocate_network(context, instance, requested_networks)
  File "/opt/stack/nova/nova/compute/manager.py", line 3054, in _try_deallocate_network
    with excutils.save_and_reraise_exception():
  File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
    self.force_reraise()
  File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
    raise self.value
  File "/opt/stack/nova/nova/compute/manager.py", line 3052, in _try_deallocate_network
    _deallocate_network_with_retries()
  File "/usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py", line 436, in func
    return evt.wait()
  File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
    result = hub.switch()
  File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
    return self.greenlet.switch()
  File "/usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py", line 150, in _run_loop
    result = func(*self.args, **self.kw)
  File "/usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py", line 407, in _func
    result = f(*args, **kwargs)
  File "/opt/stack/nova/nova/compute/manager.py", line 3041, in _deallocate_network_with_retries
    self._deallocate_network(
  File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network
    self.network_api.deallocate_for_instance(
  File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance
    data = neutron.list_ports(**search_opts)
  File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
    ret = obj(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports
    return self.list('ports', self.ports_path, retrieve_all,
  File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
    ret = obj(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list
    for r in self._pagination(collection, path, **params):
  File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination
    res = self.get(path, params=params)
  File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
    ret = obj(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get
    return self.retry_request("GET", action, body=body,
  File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
    ret = obj(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request
    return self.do_request(method, action, body=body,
  File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
    ret = obj(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request
    self._handle_fault_response(status_code, replybody, resp)
  File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper
    raise exception.NeutronAdminCredentialConfigurationInvalid()
nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/stack/nova/nova/compute/manager.py", line 2594, in _build_and_run_instance
    with self._build_resources(context, instance,
  File "/usr/lib/python3.10/contextlib.py", line 153, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/opt/stack/nova/nova/compute/manager.py", line 2889, in _build_resources
    raise exception.BuildAbortException(
nova.exception.BuildAbortException: Build of instance 7d4fa130-c399-4e8c-a711-33a08ed5dde9 aborted: Not authorized for image e70539a9-144d-4900-807e-914ae0cc8539.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance
    self._build_and_run_instance(context, instance, image,
  File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance
    with excutils.save_and_reraise_exception():
  File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
    self.force_reraise()
  File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
    raise self.value
  File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance
    with self.rt.instance_claim(context, instance, node, allocs,
  File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__
    self.abort()
  File "/opt/stack/nova/nova/compute/claims.py", line 85, in abort
    self.tracker.abort_instance_claim(self.context, self.instance,
  File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner
    return f(*args, **kwargs)
  File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim
    self._unset_instance_host_and_node(instance)
  File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node
    instance.save()
  File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper
    updates, result = self.indirection_api.object_action(
  File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action
    return cctxt.call(context, 'object_action', objinst=objinst,
  File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
    result = self.transport._send(
  File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
    return self._driver.send(target, ctxt, message,
  File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
    return self._send(target, ctxt, message, wait_for_reply, timeout,
  File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
    raise result
nova.exception_Remote.InstanceNotFound_Remote: Instance 7d4fa130-c399-4e8c-a711-33a08ed5dde9 could not be found.
Traceback (most recent call last):
  File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch
    return getattr(target, method)(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper
    return fn(self, *args, **kwargs)
  File "/opt/stack/nova/nova/objects/instance.py", line 838, in save
    old_ref, inst_ref = db.instance_update_and_get_original(
  File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper
    return f(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper
    with excutils.save_and_reraise_exception() as ectxt:
  File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
    self.force_reraise()
  File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
    raise self.value
  File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper
    return f(*args, **kwargs)
  File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper
    return f(context, *args, **kwargs)
  File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original
    instance_ref = _instance_get_by_uuid(context, instance_uuid,
  File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid
    raise exception.InstanceNotFound(instance_id=uuid)
nova.exception.InstanceNotFound: Instance 7d4fa130-c399-4e8c-a711-33a08ed5dde9 could not be found.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
    ret = obj(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response
    exception_handler_v20(status_code, error_body)
  File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20
    raise client_exc(message=error_message,
neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}}
Neutron server returns request_ids: ['req-ecb1ecef-563d-44d4-bb4b-c431371af16d']

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks
    self._deallocate_network(context, instance, requested_networks)
  File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network
    self.network_api.deallocate_for_instance(
  File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance
    data = neutron.list_ports(**search_opts)
  File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
    ret = obj(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports
    return self.list('ports', self.ports_path, retrieve_all,
  File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
    ret = obj(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list
    for r in self._pagination(collection, path, **params):
  File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination
    res = self.get(path, params=params)
  File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
    ret = obj(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get
    return self.retry_request("GET", action, body=body,
  File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
    ret = obj(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request
    return self.do_request(method, action, body=body,
  File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
    ret = obj(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request
    self._handle_fault_response(status_code, replybody, resp)
  File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper
    raise exception.Unauthorized()
nova.exception.Unauthorized: Not authorized.
[ 1291.846389] env[59518]: DEBUG oslo_vmware.api [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] Task: {'id': task-308026, 'name': DeleteDatastoreFile_Task} progress is 0%.
{{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1291.866727] env[59518]: DEBUG oslo_concurrency.lockutils [None req-1d8a99ef-29f9-4437-bd26-d2c11e8301cc tempest-DeleteServersAdminTestJSON-1007061305 tempest-DeleteServersAdminTestJSON-1007061305-project-member] Lock "7d4fa130-c399-4e8c-a711-33a08ed5dde9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 587.281s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1292.293747] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] Preparing fetch location {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1292.294082] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Creating directory with path [datastore1] vmware_temp/4b6b60dd-89db-4a4c-8263-dbbb7f13e6c6/e70539a9-144d-4900-807e-914ae0cc8539 {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1292.294365] env[59518]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-daa040ed-6752-4a33-82b6-d2e68c68a0dc {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1292.306178] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Created directory with path [datastore1] vmware_temp/4b6b60dd-89db-4a4c-8263-dbbb7f13e6c6/e70539a9-144d-4900-807e-914ae0cc8539 {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1292.306435] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] Fetch image to [datastore1] vmware_temp/4b6b60dd-89db-4a4c-8263-dbbb7f13e6c6/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1292.306668] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] Downloading image file data e70539a9-144d-4900-807e-914ae0cc8539 to [datastore1] vmware_temp/4b6b60dd-89db-4a4c-8263-dbbb7f13e6c6/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk on the data store datastore1 {{(pid=59518) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1292.307414] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-431590aa-ce8d-4f17-9bd4-8a54906a25f1 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1292.314709] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f0d8fb88-7a4c-412d-bb1a-707573acc0b3 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 
1292.323583] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f589dd41-a860-4fb9-9b11-c086c50f4f97 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1292.355692] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4d793920-9d4b-4c3e-a7e1-392305f64c36 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1292.364044] env[59518]: DEBUG oslo_vmware.api [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] Task: {'id': task-308026, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.087713} completed successfully. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1292.365707] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] Deleted the datastore file {{(pid=59518) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1292.365971] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] [instance: 3981aa30-0515-4764-9aac-d0c99a48b064] Deleted contents of the VM from datastore datastore1 {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1292.366280] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] [instance: 3981aa30-0515-4764-9aac-d0c99a48b064] Instance destroyed {{(pid=59518) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1292.366524] env[59518]: INFO nova.compute.manager [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] [instance: 3981aa30-0515-4764-9aac-d0c99a48b064] Took 0.60 seconds to destroy the instance on the hypervisor. 
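The task-308026 records above show oslo.vmware's polling loop end to end: a vSphere task is started, _poll_task logs "progress is N%", and the caller unblocks once vCenter reports success. A minimal sketch of that flow using oslo.vmware's public API follows; the host, credentials, and datastore path are placeholders, and a real DeleteDatastoreFile_Task call needs a datacenter managed-object reference.

```python
# Illustrative sketch only -- not Nova's code. VMwareAPISession logs in to
# vCenter (the SessionManager.Login / RetrieveServiceContent calls seen
# earlier in this log), invoke_api() starts the vSphere task, and
# wait_for_task() polls it, emitting the "progress is N%" and
# "completed successfully" messages via _poll_task.
from oslo_vmware import api as vmware_api

session = vmware_api.VMwareAPISession(
    'vc.example.test', 'user', 'secret',   # placeholder endpoint/credentials
    api_retry_count=10, task_poll_interval=0.5)

task = session.invoke_api(
    session.vim, 'DeleteDatastoreFile_Task',
    session.vim.service_content.fileManager,
    name='[datastore1] some-vm-dir',       # placeholder datastore path
    datacenter=None)                       # real calls pass a datacenter ref
session.wait_for_task(task)                # blocks until 'success' or raises
```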
[ 1292.368595] env[59518]: DEBUG nova.compute.claims [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] [instance: 3981aa30-0515-4764-9aac-d0c99a48b064] Aborting claim: {{(pid=59518) abort /opt/stack/nova/nova/compute/claims.py:84}}
[ 1292.368817] env[59518]: DEBUG oslo_concurrency.lockutils [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1292.369084] env[59518]: DEBUG oslo_concurrency.lockutils [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1292.372279] env[59518]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-b2284a32-f280-43d3-a8c2-a5691946d7ae {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1292.392917] env[59518]: DEBUG oslo_concurrency.lockutils [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.024s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1292.393729] env[59518]: DEBUG nova.compute.utils [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] [instance: 3981aa30-0515-4764-9aac-d0c99a48b064] Instance 3981aa30-0515-4764-9aac-d0c99a48b064 could not be found. {{(pid=59518) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1292.395256] env[59518]: DEBUG nova.compute.manager [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] [instance: 3981aa30-0515-4764-9aac-d0c99a48b064] Instance disappeared during build. {{(pid=59518) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}}
[ 1292.395543] env[59518]: DEBUG nova.compute.manager [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] [instance: 3981aa30-0515-4764-9aac-d0c99a48b064] Unplugging VIFs for instance {{(pid=59518) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 1292.395787] env[59518]: DEBUG nova.compute.manager [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. {{(pid=59518) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 1292.396029] env[59518]: DEBUG nova.compute.manager [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] [instance: 3981aa30-0515-4764-9aac-d0c99a48b064] Deallocating network for instance {{(pid=59518) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 1292.396248] env[59518]: DEBUG nova.network.neutron [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] [instance: 3981aa30-0515-4764-9aac-d0c99a48b064] deallocate_for_instance() {{(pid=59518) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}}
[ 1292.399717] env[59518]: DEBUG nova.virt.vmwareapi.images [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] Downloading image file data e70539a9-144d-4900-807e-914ae0cc8539 to the data store datastore1 {{(pid=59518) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 1292.426056] env[59518]: DEBUG oslo_concurrency.lockutils [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 1292.426840] env[59518]: ERROR nova.compute.manager [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image e70539a9-144d-4900-807e-914ae0cc8539.
[ 1292.426840] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] Traceback (most recent call last):
  File "/opt/stack/nova/nova/image/glance.py", line 285, in show
    image = self._client.call(context, 2, 'get', args=(image_id,))
  File "/opt/stack/nova/nova/image/glance.py", line 191, in call
    result = getattr(controller, method)(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
    return self._get(image_id)
  File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
    return RequestIdProxy(wrapped(*args, **kwargs))
  File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
    resp, body = self.http_client.get(url, headers=header)
  File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
    return self.request(url, 'GET', **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
    return self._handle_response(resp)
  File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
    raise exc.from_response(resp, resp.content)
glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
    yield resources
  File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
    self.driver.spawn(context, instance, image_meta,
  File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
    self._vmops.spawn(context, instance, image_meta, injected_files,
  File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
    self._fetch_image_if_missing(context, vi)
  File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing
    image_fetch(context, vi, tmp_image_ds_loc)
  File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file
    images.fetch_image(
  File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image
    metadata = IMAGE_API.get(context, image_ref)
  File "/opt/stack/nova/nova/image/glance.py", line 1205, in get
    return session.show(context, image_id,
  File "/opt/stack/nova/nova/image/glance.py", line 287, in show
    _reraise_translated_image_exception(image_id)
  File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception
    raise new_exc.with_traceback(exc_trace)
  File "/opt/stack/nova/nova/image/glance.py", line 285, in show
    image = self._client.call(context, 2, 'get', args=(image_id,))
  File "/opt/stack/nova/nova/image/glance.py", line 191, in call
    result = getattr(controller, method)(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
    return self._get(image_id)
  File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
    return RequestIdProxy(wrapped(*args, **kwargs))
  File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
    resp, body = self.http_client.get(url, headers=header)
  File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
    return self.request(url, 'GET', **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
    return self._handle_response(resp)
  File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
    raise exc.from_response(resp, resp.content)
nova.exception.ImageNotAuthorized: Not authorized for image e70539a9-144d-4900-807e-914ae0cc8539.
[ 1292.428502] env[59518]: INFO nova.compute.manager [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] Terminating instance
[ 1292.430171] env[59518]: DEBUG nova.compute.manager [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] Start destroying the instance on the hypervisor. {{(pid=59518) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 1292.430421] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] Destroying instance {{(pid=59518) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1292.430756] env[59518]: DEBUG oslo_concurrency.lockutils [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 1292.431030] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1292.431858] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-962795b7-5768-4a58-a0c7-0b83fd54b918 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1292.434663] env[59518]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4edc40bc-1f13-4d1c-8fd0-433b90a0a0b9 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1292.440893] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] Unregistering the VM {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 1292.441185] env[59518]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-336b2754-444b-45ad-a472-3405e98b44cf {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1292.444586] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1292.444839] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=59518) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 1292.445801] env[59518]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-549a40d2-ad96-46f8-9413-a387fc99c4b0 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1292.450583] env[59518]: DEBUG oslo_vmware.api [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] Waiting for the task: (returnval){
  value = "session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]5249270a-2500-f181-9bf8-39824f6308d6"
  _type = "Task"
} to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1292.457980] env[59518]: DEBUG oslo_vmware.api [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] Task: {'id': session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]5249270a-2500-f181-9bf8-39824f6308d6, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1292.491979] env[59518]: DEBUG neutronclient.v2_0.client [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=59518) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}}
[ 1292.494130] env[59518]: ERROR nova.compute.manager [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] [instance: 3981aa30-0515-4764-9aac-d0c99a48b064] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized.
[ 1292.494130] env[59518]: ERROR nova.compute.manager [instance: 3981aa30-0515-4764-9aac-d0c99a48b064] Traceback (most recent call last):
  File "/opt/stack/nova/nova/image/glance.py", line 285, in show
    image = self._client.call(context, 2, 'get', args=(image_id,))
  File "/opt/stack/nova/nova/image/glance.py", line 191, in call
    result = getattr(controller, method)(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
    return self._get(image_id)
  File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
    return RequestIdProxy(wrapped(*args, **kwargs))
  File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
    resp, body = self.http_client.get(url, headers=header)
  File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
    return self.request(url, 'GET', **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
    return self._handle_response(resp)
  File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
    raise exc.from_response(resp, resp.content)
glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
    self.driver.spawn(context, instance, image_meta,
  File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
    self._vmops.spawn(context, instance, image_meta, injected_files,
  File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
    self._fetch_image_if_missing(context, vi)
  File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing
    image_fetch(context, vi, tmp_image_ds_loc)
  File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file
    images.fetch_image(
  File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image
    metadata = IMAGE_API.get(context, image_ref)
  File "/opt/stack/nova/nova/image/glance.py", line 1205, in get
    return session.show(context, image_id,
  File "/opt/stack/nova/nova/image/glance.py", line 287, in show
    _reraise_translated_image_exception(image_id)
  File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception
    raise new_exc.with_traceback(exc_trace)
  File "/opt/stack/nova/nova/image/glance.py", line 285, in show
    image = self._client.call(context, 2, 'get', args=(image_id,))
  File "/opt/stack/nova/nova/image/glance.py", line 191, in call
    result = getattr(controller, method)(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
    return self._get(image_id)
  File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
    return RequestIdProxy(wrapped(*args, **kwargs))
  File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
    resp, body = self.http_client.get(url, headers=header)
  File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
    return self.request(url, 'GET', **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
    return self._handle_response(resp)
  File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
    raise exc.from_response(resp, resp.content)
nova.exception.ImageNotAuthorized: Not authorized for image e70539a9-144d-4900-807e-914ae0cc8539.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance
    self._build_and_run_instance(context, instance, image,
  File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance
    with excutils.save_and_reraise_exception():
  File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
    self.force_reraise()
  File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
    raise self.value
  File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance
    with self.rt.instance_claim(context, instance, node, allocs,
  File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__
    self.abort()
  File "/opt/stack/nova/nova/compute/claims.py", line 85, in abort
    self.tracker.abort_instance_claim(self.context, self.instance,
  File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner
    return f(*args, **kwargs)
  File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim
    self._unset_instance_host_and_node(instance)
  File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node
    instance.save()
  File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper
    updates, result = self.indirection_api.object_action(
  File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action
    return cctxt.call(context, 'object_action', objinst=objinst,
  File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
    result = self.transport._send(
  File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
    return self._driver.send(target, ctxt, message,
  File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
    return self._send(target, ctxt, message, wait_for_reply, timeout,
  File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
    raise result
nova.exception_Remote.InstanceNotFound_Remote: Instance 3981aa30-0515-4764-9aac-d0c99a48b064 could not be found.
[ 1292.496434] env[59518]: ERROR nova.compute.manager [instance: 3981aa30-0515-4764-9aac-d0c99a48b064] Traceback (most recent call last):
  File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch
    return getattr(target, method)(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper
    return fn(self, *args, **kwargs)
  File "/opt/stack/nova/nova/objects/instance.py", line 838, in save
    old_ref, inst_ref = db.instance_update_and_get_original(
  File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper
    return f(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper
    with excutils.save_and_reraise_exception() as ectxt:
  File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
    self.force_reraise()
  File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
    raise self.value
  File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper
    return f(*args, **kwargs)
  File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper
    return f(context, *args, **kwargs)
  File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original
    instance_ref = _instance_get_by_uuid(context, instance_uuid,
  File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid
    raise exception.InstanceNotFound(instance_id=uuid)
nova.exception.InstanceNotFound: Instance 3981aa30-0515-4764-9aac-d0c99a48b064 could not be found.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
    ret = obj(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response
    exception_handler_v20(status_code, error_body)
  File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20
    raise client_exc(message=error_message,
neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}}
Neutron server returns request_ids: ['req-3fda1f64-deb7-45dc-b28f-58fc92491880']

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks
    self._deallocate_network(context, instance, requested_networks)
  File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network
    self.network_api.deallocate_for_instance(
  File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance
    data = neutron.list_ports(**search_opts)
  File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
    ret = obj(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports
    return self.list('ports', self.ports_path, retrieve_all,
  File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
    ret = obj(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list
    for r in self._pagination(collection, path, **params):
  File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination
    res = self.get(path, params=params)
  File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
    ret = obj(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get
    return self.retry_request("GET", action, body=body,
  File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
    ret = obj(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request
    return self.do_request(method, action, body=body,
  File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
    ret = obj(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request
    self._handle_fault_response(status_code, replybody, resp)
  File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper
    raise exception.Unauthorized()
nova.exception.Unauthorized: Not authorized.
[ 1292.498650] env[59518]: ERROR nova.compute.manager [instance: 3981aa30-0515-4764-9aac-d0c99a48b064] [ 1292.499522] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] Unregistered the VM {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1292.499885] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] Deleting contents of the VM from datastore datastore1 {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1292.500221] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Deleting the datastore file [datastore1] 39cfe606-43a0-4a52-8ec1-433baf7a3aec {{(pid=59518) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1292.500598] env[59518]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-5d90e1e1-e064-43fd-8622-0ede4a396ad4 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1292.512187] env[59518]: DEBUG oslo_vmware.api [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Waiting for the task: (returnval){ [ 1292.512187] env[59518]: value = "task-308028" [ 1292.512187] env[59518]: _type = "Task" [ 1292.512187] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1292.518517] env[59518]: DEBUG oslo_concurrency.lockutils [None req-551773ca-ea59-4d4c-999c-5f6dba78ddae tempest-ServerMetadataTestJSON-1358222584 tempest-ServerMetadataTestJSON-1358222584-project-member] Lock "3981aa30-0515-4764-9aac-d0c99a48b064" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 477.244s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1292.523610] env[59518]: DEBUG oslo_vmware.api [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Task: {'id': task-308028, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1292.960829] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] Preparing fetch location {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1292.961216] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] Creating directory with path [datastore1] vmware_temp/6c06f1b3-286f-4f8b-9456-a985e9e04a47/e70539a9-144d-4900-807e-914ae0cc8539 {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1292.961299] env[59518]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5145978f-9169-473b-b58e-1866ab6d7348 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1292.972675] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] Created directory with path [datastore1] vmware_temp/6c06f1b3-286f-4f8b-9456-a985e9e04a47/e70539a9-144d-4900-807e-914ae0cc8539 {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1292.972804] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] Fetch image to [datastore1] vmware_temp/6c06f1b3-286f-4f8b-9456-a985e9e04a47/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1292.972963] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] Downloading image file data e70539a9-144d-4900-807e-914ae0cc8539 to [datastore1] vmware_temp/6c06f1b3-286f-4f8b-9456-a985e9e04a47/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk on the data store datastore1 {{(pid=59518) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1292.973617] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ae45f541-9c26-40a1-a352-40f4bdc685b6 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1292.979799] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42781db4-61ce-4c80-ae62-98ed500e0c27 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1292.988379] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fef7c84f-2a62-48fc-856f-b9cacbae83c3 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1293.021564] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-48b47adf-654a-4f8d-b769-0a591e5b84d5 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1293.027936] env[59518]: DEBUG oslo_vmware.api [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Task: {'id': task-308028, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.075397} completed successfully. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1293.029409] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Deleted the datastore file {{(pid=59518) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1293.029591] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] Deleted contents of the VM from datastore datastore1 {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1293.029753] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] Instance destroyed {{(pid=59518) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1293.029915] env[59518]: INFO nova.compute.manager [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1293.031902] env[59518]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-15d382d4-3bab-43e1-9368-ceaea2dc0b99 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1293.033665] env[59518]: DEBUG nova.compute.claims [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] Aborting claim: {{(pid=59518) abort /opt/stack/nova/nova/compute/claims.py:84}}
[ 1293.033816] env[59518]: DEBUG oslo_concurrency.lockutils [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1293.034011] env[59518]: DEBUG oslo_concurrency.lockutils [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1293.057050] env[59518]: DEBUG oslo_concurrency.lockutils [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.023s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1293.057962] env[59518]: DEBUG nova.compute.utils [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] Instance 39cfe606-43a0-4a52-8ec1-433baf7a3aec could not be found. {{(pid=59518) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1293.061134] env[59518]: DEBUG nova.virt.vmwareapi.images [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] Downloading image file data e70539a9-144d-4900-807e-914ae0cc8539 to the data store datastore1 {{(pid=59518) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 1293.063402] env[59518]: DEBUG nova.compute.manager [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] Instance disappeared during build. {{(pid=59518) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}}
[ 1293.063594] env[59518]: DEBUG nova.compute.manager [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] Unplugging VIFs for instance {{(pid=59518) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 1293.063765] env[59518]: DEBUG nova.compute.manager [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59518) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 1293.064071] env[59518]: DEBUG nova.compute.manager [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] Deallocating network for instance {{(pid=59518) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 1293.064104] env[59518]: DEBUG nova.network.neutron [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] deallocate_for_instance() {{(pid=59518) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}}
[ 1293.147417] env[59518]: DEBUG oslo_vmware.rw_handles [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/6c06f1b3-286f-4f8b-9456-a985e9e04a47/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59518) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}}
[ 1293.197489] env[59518]: DEBUG neutronclient.v2_0.client [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=59518) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}}
[ 1293.199064] env[59518]: ERROR nova.compute.manager [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized.
[ 1293.199064] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] Traceback (most recent call last):
[ 1293.199064] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1293.199064] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1293.199064] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1293.199064] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] result = getattr(controller, method)(*args, **kwargs)
[ 1293.199064] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 1293.199064] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] return self._get(image_id)
[ 1293.199064] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 1293.199064] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] return RequestIdProxy(wrapped(*args, **kwargs))
[ 1293.199064] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 1293.199503] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] resp, body = self.http_client.get(url, headers=header)
[ 1293.199503] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 1293.199503] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] return self.request(url, 'GET', **kwargs)
[ 1293.199503] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 1293.199503] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] return self._handle_response(resp)
[ 1293.199503] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1293.199503] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] raise exc.from_response(resp, resp.content)
[ 1293.199503] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required.
[ 1293.199503] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec]
[ 1293.199503] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] During handling of the above exception, another exception occurred:
[ 1293.199503] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec]
[ 1293.199503] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] Traceback (most recent call last):
[ 1293.199503] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 1293.199838] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] self.driver.spawn(context, instance, image_meta,
[ 1293.199838] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 1293.199838] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1293.199838] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1293.199838] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] self._fetch_image_if_missing(context, vi)
[ 1293.199838] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing
[ 1293.199838] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] image_fetch(context, vi, tmp_image_ds_loc)
[ 1293.199838] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file
[ 1293.199838] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] images.fetch_image(
[ 1293.199838] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image
[ 1293.199838] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] metadata = IMAGE_API.get(context, image_ref)
[ 1293.199838] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get
[ 1293.199838] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] return session.show(context, image_id,
[ 1293.200205] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/opt/stack/nova/nova/image/glance.py", line 287, in show
[ 1293.200205] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] _reraise_translated_image_exception(image_id)
[ 1293.200205] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception
[ 1293.200205] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] raise new_exc.with_traceback(exc_trace)
[ 1293.200205] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1293.200205] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1293.200205] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1293.200205] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] result = getattr(controller, method)(*args, **kwargs)
[ 1293.200205] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 1293.200205] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] return self._get(image_id)
[ 1293.200205] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 1293.200205] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] return RequestIdProxy(wrapped(*args, **kwargs))
[ 1293.200205] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 1293.200560] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] resp, body = self.http_client.get(url, headers=header)
[ 1293.200560] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 1293.200560] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] return self.request(url, 'GET', **kwargs)
[ 1293.200560] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 1293.200560] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] return self._handle_response(resp)
[ 1293.200560] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1293.200560] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] raise exc.from_response(resp, resp.content)
[ 1293.200560] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] nova.exception.ImageNotAuthorized: Not authorized for image e70539a9-144d-4900-807e-914ae0cc8539.
[ 1293.200560] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec]
[ 1293.200560] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] During handling of the above exception, another exception occurred:
[ 1293.200560] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec]
[ 1293.200560] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] Traceback (most recent call last):
[ 1293.200560] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance
[ 1293.200915] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] self._build_and_run_instance(context, instance, image,
[ 1293.200915] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance
[ 1293.200915] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] with excutils.save_and_reraise_exception():
[ 1293.200915] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1293.200915] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] self.force_reraise()
[ 1293.200915] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1293.200915] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] raise self.value
[ 1293.200915] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance
[ 1293.200915] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] with self.rt.instance_claim(context, instance, node, allocs,
[ 1293.200915] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__
[ 1293.200915] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] self.abort()
[ 1293.200915] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/opt/stack/nova/nova/compute/claims.py", line 85, in abort
[ 1293.200915] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] self.tracker.abort_instance_claim(self.context, self.instance,
[ 1293.201408] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner
[ 1293.201408] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] return f(*args, **kwargs)
[ 1293.201408] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim
[ 1293.201408] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] self._unset_instance_host_and_node(instance)
[ 1293.201408] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node
[ 1293.201408] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] instance.save()
[ 1293.201408] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper
[ 1293.201408] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] updates, result = self.indirection_api.object_action(
[ 1293.201408] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action
[ 1293.201408] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] return cctxt.call(context, 'object_action', objinst=objinst,
[ 1293.201408] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 1293.201408] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] result = self.transport._send(
[ 1293.201762] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 1293.201762] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] return self._driver.send(target, ctxt, message,
[ 1293.201762] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 1293.201762] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 1293.201762] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 1293.201762] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] raise result
[ 1293.201762] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] nova.exception_Remote.InstanceNotFound_Remote: Instance 39cfe606-43a0-4a52-8ec1-433baf7a3aec could not be found.
[ 1293.201762] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] Traceback (most recent call last): [ 1293.201762] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] [ 1293.201762] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1293.201762] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] return getattr(target, method)(*args, **kwargs) [ 1293.201762] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] [ 1293.201762] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1293.202125] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] return fn(self, *args, **kwargs) [ 1293.202125] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] [ 1293.202125] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1293.202125] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] old_ref, inst_ref = db.instance_update_and_get_original( [ 1293.202125] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] [ 1293.202125] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1293.202125] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] return f(*args, **kwargs) [ 1293.202125] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] [ 1293.202125] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1293.202125] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] with excutils.save_and_reraise_exception() as ectxt: [ 1293.202125] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] [ 1293.202125] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1293.202125] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] self.force_reraise() [ 1293.202125] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] [ 1293.202125] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1293.202585] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] raise self.value [ 1293.202585] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] [ 1293.202585] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1293.202585] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] return f(*args, 
**kwargs) [ 1293.202585] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] [ 1293.202585] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1293.202585] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] return f(context, *args, **kwargs) [ 1293.202585] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] [ 1293.202585] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1293.202585] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1293.202585] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] [ 1293.202585] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1293.202585] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] raise exception.InstanceNotFound(instance_id=uuid) [ 1293.202585] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] [ 1293.202585] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] nova.exception.InstanceNotFound: Instance 39cfe606-43a0-4a52-8ec1-433baf7a3aec could not be found. [ 1293.203017] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] [ 1293.203017] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] [ 1293.203017] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] During handling of the above exception, another exception occurred: [ 1293.203017] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] [ 1293.203017] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] Traceback (most recent call last): [ 1293.203017] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1293.203017] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] ret = obj(*args, **kwargs) [ 1293.203017] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1293.203017] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] exception_handler_v20(status_code, error_body) [ 1293.203017] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1293.203017] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] raise client_exc(message=error_message, [ 1293.203017] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1293.203017] 
env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] Neutron server returns request_ids: ['req-194b593c-dd38-4431-8ec3-3ec1e4eb84b4'] [ 1293.203017] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] [ 1293.203413] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] During handling of the above exception, another exception occurred: [ 1293.203413] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] [ 1293.203413] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] Traceback (most recent call last): [ 1293.203413] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1293.203413] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] self._deallocate_network(context, instance, requested_networks) [ 1293.203413] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1293.203413] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] self.network_api.deallocate_for_instance( [ 1293.203413] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance [ 1293.203413] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] data = neutron.list_ports(**search_opts) [ 1293.203413] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1293.203413] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] ret = obj(*args, **kwargs) [ 1293.203413] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1293.203413] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] return self.list('ports', self.ports_path, retrieve_all, [ 1293.203793] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1293.203793] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] ret = obj(*args, **kwargs) [ 1293.203793] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1293.203793] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] for r in self._pagination(collection, path, **params): [ 1293.203793] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1293.203793] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] res = self.get(path, params=params) [ 1293.203793] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1293.203793] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] ret = obj(*args, **kwargs) [ 1293.203793] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1293.203793] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] return self.retry_request("GET", action, body=body, [ 1293.203793] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1293.203793] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] ret = obj(*args, **kwargs) [ 1293.203793] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1293.204380] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] return self.do_request(method, action, body=body, [ 1293.204380] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1293.204380] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] ret = obj(*args, **kwargs) [ 1293.204380] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1293.204380] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] self._handle_fault_response(status_code, replybody, resp) [ 1293.204380] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1293.204380] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] raise exception.Unauthorized() [ 1293.204380] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] nova.exception.Unauthorized: Not authorized. [ 1293.204380] env[59518]: ERROR nova.compute.manager [instance: 39cfe606-43a0-4a52-8ec1-433baf7a3aec] [ 1293.204380] env[59518]: DEBUG oslo_vmware.rw_handles [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] Completed reading data from the image iterator. {{(pid=59518) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1293.204764] env[59518]: DEBUG oslo_vmware.rw_handles [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/6c06f1b3-286f-4f8b-9456-a985e9e04a47/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=59518) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1293.225440] env[59518]: DEBUG oslo_concurrency.lockutils [None req-ab430b6b-1ce8-4d7e-8375-0a16b9e105ed tempest-DeleteServersTestJSON-659144729 tempest-DeleteServersTestJSON-659144729-project-member] Lock "39cfe606-43a0-4a52-8ec1-433baf7a3aec" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 343.174s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1339.803738] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1341.449221] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1342.311380] env[59518]: WARNING oslo_vmware.rw_handles [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1342.311380] env[59518]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1342.311380] env[59518]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1342.311380] env[59518]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1342.311380] env[59518]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1342.311380] env[59518]: ERROR oslo_vmware.rw_handles response.begin() [ 1342.311380] env[59518]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1342.311380] env[59518]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1342.311380] env[59518]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1342.311380] env[59518]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1342.311380] env[59518]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1342.311380] env[59518]: ERROR oslo_vmware.rw_handles [ 1342.311994] env[59518]: DEBUG nova.virt.vmwareapi.images [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] Downloaded image file data e70539a9-144d-4900-807e-914ae0cc8539 to vmware_temp/6c06f1b3-286f-4f8b-9456-a985e9e04a47/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk on the data store datastore1 {{(pid=59518) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1342.313712] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] Caching image {{(pid=59518) _fetch_image_if_missing 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1342.313992] env[59518]: DEBUG nova.virt.vmwareapi.vm_util [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] Copying Virtual Disk [datastore1] vmware_temp/6c06f1b3-286f-4f8b-9456-a985e9e04a47/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk to [datastore1] vmware_temp/6c06f1b3-286f-4f8b-9456-a985e9e04a47/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk {{(pid=59518) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1342.314304] env[59518]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-f23b14e1-9093-40a2-8aa2-82265f1c9c74 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1342.322433] env[59518]: DEBUG oslo_vmware.api [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] Waiting for the task: (returnval){ [ 1342.322433] env[59518]: value = "task-308029" [ 1342.322433] env[59518]: _type = "Task" [ 1342.322433] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1342.329859] env[59518]: DEBUG oslo_vmware.api [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] Task: {'id': task-308029, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1342.833300] env[59518]: DEBUG oslo_vmware.exceptions [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] Fault InvalidArgument not matched. 
{{(pid=59518) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1342.833689] env[59518]: DEBUG oslo_concurrency.lockutils [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1342.834198] env[59518]: ERROR nova.compute.manager [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1342.834198] env[59518]: Faults: ['InvalidArgument'] [ 1342.834198] env[59518]: ERROR nova.compute.manager [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] Traceback (most recent call last): [ 1342.834198] env[59518]: ERROR nova.compute.manager [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1342.834198] env[59518]: ERROR nova.compute.manager [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] yield resources [ 1342.834198] env[59518]: ERROR nova.compute.manager [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1342.834198] env[59518]: ERROR nova.compute.manager [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] self.driver.spawn(context, instance, image_meta, [ 1342.834198] env[59518]: ERROR nova.compute.manager [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1342.834198] env[59518]: ERROR nova.compute.manager [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1342.834198] env[59518]: ERROR nova.compute.manager [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1342.834198] env[59518]: ERROR nova.compute.manager [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] self._fetch_image_if_missing(context, vi) [ 1342.834198] env[59518]: ERROR nova.compute.manager [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1342.834591] env[59518]: ERROR nova.compute.manager [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] image_cache(vi, tmp_image_ds_loc) [ 1342.834591] env[59518]: ERROR nova.compute.manager [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1342.834591] env[59518]: ERROR nova.compute.manager [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] vm_util.copy_virtual_disk( [ 1342.834591] env[59518]: ERROR nova.compute.manager [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1342.834591] env[59518]: ERROR nova.compute.manager [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] session._wait_for_task(vmdk_copy_task) [ 1342.834591] env[59518]: ERROR nova.compute.manager [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1342.834591] env[59518]: ERROR nova.compute.manager [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] return self.wait_for_task(task_ref) [ 1342.834591] env[59518]: ERROR nova.compute.manager [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1342.834591] env[59518]: ERROR nova.compute.manager [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] return evt.wait() [ 1342.834591] env[59518]: ERROR nova.compute.manager [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1342.834591] env[59518]: ERROR nova.compute.manager [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] result = hub.switch() [ 1342.834591] env[59518]: ERROR nova.compute.manager [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1342.834591] env[59518]: ERROR nova.compute.manager [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] return self.greenlet.switch() [ 1342.834926] env[59518]: ERROR nova.compute.manager [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1342.834926] env[59518]: ERROR nova.compute.manager [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] self.f(*self.args, **self.kw) [ 1342.834926] env[59518]: ERROR nova.compute.manager [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1342.834926] env[59518]: ERROR nova.compute.manager [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] raise exceptions.translate_fault(task_info.error) [ 1342.834926] env[59518]: ERROR nova.compute.manager [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1342.834926] env[59518]: ERROR nova.compute.manager [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] Faults: ['InvalidArgument'] [ 1342.834926] env[59518]: ERROR nova.compute.manager [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] [ 1342.834926] env[59518]: INFO nova.compute.manager [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] Terminating instance [ 1342.836203] env[59518]: DEBUG oslo_concurrency.lockutils [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e70539a9-144d-4900-807e-914ae0cc8539/e70539a9-144d-4900-807e-914ae0cc8539.vmdk" {{(pid=59518) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1342.836437] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1342.836695] env[59518]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-bf7fcbbc-8ede-4baf-ace1-7c477e4a83a2 {{(pid=59518) request_handler 
[ 1342.838919] env[59518]: DEBUG nova.compute.manager [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] Start destroying the instance on the hypervisor. {{(pid=59518) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 1342.839144] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] Destroying instance {{(pid=59518) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1342.839910] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ed2f441-e7e3-4f8c-ae11-63a932454c28 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1342.847114] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] Unregistering the VM {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 1342.847341] env[59518]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-9f7cc5e1-bae5-4dee-8464-ac019c903cc4 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1342.849555] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1342.849757] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=59518) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 1342.850763] env[59518]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-255418d1-5ce4-4783-8dce-64b56895f3b8 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1342.861261] env[59518]: DEBUG oslo_vmware.api [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] Waiting for the task: (returnval){
[ 1342.861261] env[59518]:     value = "session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]52aacc85-6adb-a4e5-487c-bf1202d1eafa"
[ 1342.861261] env[59518]:     _type = "Task"
[ 1342.861261] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1342.868793] env[59518]: DEBUG oslo_vmware.api [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] Task: {'id': session[52bbfb38-ca8b-5e49-6f07-a6b1bd180a48]52aacc85-6adb-a4e5-487c-bf1202d1eafa, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1342.926751] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] Unregistered the VM {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 1342.926990] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] Deleting contents of the VM from datastore datastore1 {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 1342.927203] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] Deleting the datastore file [datastore1] 468e2dc5-6a66-401d-b6cd-06bb94cea0ef {{(pid=59518) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 1342.927490] env[59518]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-368ce656-97ae-4bfc-9441-2500f184c72d {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1342.933390] env[59518]: DEBUG oslo_vmware.api [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] Waiting for the task: (returnval){
[ 1342.933390] env[59518]:     value = "task-308031"
[ 1342.933390] env[59518]:     _type = "Task"
[ 1342.933390] env[59518]: } to complete. {{(pid=59518) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1342.941294] env[59518]: DEBUG oslo_vmware.api [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] Task: {'id': task-308031, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
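Both "Waiting for the task" blocks above go through the same wait_for_task helper: the SearchDatastore_Task probes the image cache for the second build, while DeleteDatastoreFile_Task removes the failed instance's datastore directory. A hedged sketch of how such a delete is issued and awaited, assuming session is an oslo_vmware.api.VMwareAPISession; this approximates nova.virt.vmwareapi.ds_util.file_delete rather than quoting it, and the ds_path default simply mirrors the log:

def delete_datastore_file(session, datacenter_ref,
                          ds_path='[datastore1] 468e2dc5-6a66-401d-b6cd-06bb94cea0ef'):
    # The FileManager lives on the vCenter service content;
    # DeleteDatastoreFile_Task returns a task reference that
    # wait_for_task() polls to completion (or raises on a fault).
    file_manager = session.vim.service_content.fileManager
    task_ref = session.invoke_api(session.vim, 'DeleteDatastoreFile_Task',
                                  file_manager, name=ds_path,
                                  datacenter=datacenter_ref)
    session.wait_for_task(task_ref)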
[ 1343.371150] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] [instance: 8b692644-9080-4ffd-89b3-c8cd64de0e4f] Preparing fetch location {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 1343.371434] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] Creating directory with path [datastore1] vmware_temp/03117348-d4ae-4786-95f1-cd014317f798/e70539a9-144d-4900-807e-914ae0cc8539 {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1343.371674] env[59518]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-61f84a3f-0292-44b4-9a5f-5fc3e30b81c1 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1343.382711] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] Created directory with path [datastore1] vmware_temp/03117348-d4ae-4786-95f1-cd014317f798/e70539a9-144d-4900-807e-914ae0cc8539 {{(pid=59518) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1343.382885] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] [instance: 8b692644-9080-4ffd-89b3-c8cd64de0e4f] Fetch image to [datastore1] vmware_temp/03117348-d4ae-4786-95f1-cd014317f798/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk {{(pid=59518) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 1343.383050] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] [instance: 8b692644-9080-4ffd-89b3-c8cd64de0e4f] Downloading image file data e70539a9-144d-4900-807e-914ae0cc8539 to [datastore1] vmware_temp/03117348-d4ae-4786-95f1-cd014317f798/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk on the data store datastore1 {{(pid=59518) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 1343.383757] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-47175f56-539a-47c8-918c-cd6818f9558a {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1343.389894] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2015106a-d64b-491e-a684-5803ac8eaf26 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1343.399783] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d2d1aa70-14ea-4e4e-974e-bdb5b0d43336 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1344.066825] env[59518]: DEBUG oslo_vmware.api [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] Task: {'id': task-308031, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.066811} completed successfully. {{(pid=59518) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 1344.089457] env[59518]: DEBUG nova.virt.vmwareapi.ds_util [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] Deleted the datastore file {{(pid=59518) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 1344.089640] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] Deleted contents of the VM from datastore datastore1 {{(pid=59518) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 1344.089801] env[59518]: DEBUG nova.virt.vmwareapi.vmops [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] Instance destroyed {{(pid=59518) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1344.089962] env[59518]: INFO nova.compute.manager [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] Took 1.25 seconds to destroy the instance on the hypervisor.
[ 1344.092217] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cc51a41f-db20-48a5-8a73-1697445c2a13 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1344.094786] env[59518]: DEBUG nova.compute.claims [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] Aborting claim: {{(pid=59518) abort /opt/stack/nova/nova/compute/claims.py:84}}
[ 1344.094944] env[59518]: DEBUG oslo_concurrency.lockutils [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1344.095143] env[59518]: DEBUG oslo_concurrency.lockutils [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1344.102513] env[59518]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-fa6f28d9-140c-46cc-b2ad-1a6d4a35aa2c {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1344.119156] env[59518]: DEBUG oslo_concurrency.lockutils [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.024s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
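The Acquiring/acquired/released triple around abort_instance_claim is oslo.concurrency's standard lock logging: the resource tracker serializes every mutation of its in-memory usage under a single "compute_resources" lock. A sketch of the pattern, assuming oslo_concurrency is available (the class and method body are illustrative, not Nova's source):

from oslo_concurrency import lockutils

class ResourceTrackerSketch:
    # Each pass through this decorator produces the Acquiring/acquired/
    # released lockutils lines seen in the log, with wait and hold times.
    @lockutils.synchronized('compute_resources')
    def abort_instance_claim(self, context, instance, nodename):
        # Critical section: drop the instance from tracked usage and
        # clean up its placement allocations.
        pass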
by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.024s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1344.120023] env[59518]: DEBUG nova.compute.utils [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] Instance 468e2dc5-6a66-401d-b6cd-06bb94cea0ef could not be found. {{(pid=59518) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1344.121781] env[59518]: DEBUG nova.compute.manager [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] Instance disappeared during build. {{(pid=59518) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1344.122074] env[59518]: DEBUG nova.compute.manager [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] Unplugging VIFs for instance {{(pid=59518) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1344.122379] env[59518]: DEBUG nova.compute.manager [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59518) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1344.122665] env[59518]: DEBUG nova.compute.manager [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] Deallocating network for instance {{(pid=59518) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1344.123042] env[59518]: DEBUG nova.network.neutron [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] deallocate_for_instance() {{(pid=59518) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1344.126737] env[59518]: DEBUG nova.virt.vmwareapi.images [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] [instance: 8b692644-9080-4ffd-89b3-c8cd64de0e4f] Downloading image file data e70539a9-144d-4900-807e-914ae0cc8539 to the data store datastore1 {{(pid=59518) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1344.149281] env[59518]: DEBUG nova.network.neutron [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] Updating instance_info_cache with network_info: [] {{(pid=59518) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1344.157781] env[59518]: INFO nova.compute.manager [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 
tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] Took 0.03 seconds to deallocate network for instance. [ 1344.173339] env[59518]: DEBUG oslo_vmware.rw_handles [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/03117348-d4ae-4786-95f1-cd014317f798/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59518) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1344.227362] env[59518]: DEBUG oslo_vmware.rw_handles [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] Completed reading data from the image iterator. {{(pid=59518) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1344.227555] env[59518]: DEBUG oslo_vmware.rw_handles [None req-a5b9e622-327a-42db-8ffb-8e95c449b3ad tempest-ServersTestJSON-439153303 tempest-ServersTestJSON-439153303-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/03117348-d4ae-4786-95f1-cd014317f798/e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59518) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1344.244056] env[59518]: DEBUG oslo_concurrency.lockutils [None req-4e95b399-6f10-4b29-9e75-1afc9454a45e tempest-ServerMetadataNegativeTestJSON-1362833201 tempest-ServerMetadataNegativeTestJSON-1362833201-project-member] Lock "468e2dc5-6a66-401d-b6cd-06bb94cea0ef" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 290.816s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1344.244208] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "468e2dc5-6a66-401d-b6cd-06bb94cea0ef" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 133.641s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1344.244403] env[59518]: INFO nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: 468e2dc5-6a66-401d-b6cd-06bb94cea0ef] During sync_power_state the instance has a pending task (spawning). Skip. 
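The rw_handles entries show how the image actually lands on the datastore: the 21318656-byte sparse vmdk is streamed over the datastore's HTTPS /folder endpoint rather than through the vCenter SOAP API. A rough sketch of that streaming loop, assuming a handle that behaves like oslo_vmware.rw_handles.FileWriteHandle; the constructor arguments here are assumptions reconstructed from the URL in the log, not a verified signature:

from oslo_vmware import rw_handles

def upload_image(image_iter, cookies, file_size=21318656):
    # Writes go to https://<esx-host>:443/folder/<path>?dcPath=...&dsName=...
    # exactly as logged by _create_write_connection above.
    handle = rw_handles.FileWriteHandle(
        'esx7c2n2.openstack.eu-de-1.cloud.sap', 443, 'ha-datacenter',
        'datastore1', cookies,
        'vmware_temp/03117348-d4ae-4786-95f1-cd014317f798/'
        'e70539a9-144d-4900-807e-914ae0cc8539/tmp-sparse.vmdk',
        file_size)
    for chunk in image_iter:   # "Completed reading data from the image iterator."
        handle.write(chunk)
    handle.close()             # "Closing write handle for ..."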
[ 1344.244631] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "468e2dc5-6a66-401d-b6cd-06bb94cea0ef" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1347.449542] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1347.449973] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Starting heal instance info cache {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9808}}
[ 1347.449973] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Rebuilding the list of instances to heal {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}}
[ 1347.461046] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] [instance: 8b692644-9080-4ffd-89b3-c8cd64de0e4f] Skipping network cache update for instance because it is Building. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}}
[ 1347.461221] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Didn't find any instances for network info cache update. {{(pid=59518) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9894}}
[ 1348.447369] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1349.448471] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1349.448865] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1350.442800] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1350.447434] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1350.447591] env[59518]: DEBUG nova.compute.manager [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=59518) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10427}}
[ 1350.447753] env[59518]: DEBUG oslo_service.periodic_task [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59518) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1350.458793] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1350.459121] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1350.459161] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1350.459304] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59518) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}}
[ 1350.460366] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b263049d-3558-40ee-94c9-2eb1bcc4ba1f {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1350.468918] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-65dc44d6-bf15-4e9b-bb07-16174eaa3926 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1350.483014] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2832c73b-2347-4649-bcda-d87460645107 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1350.488980] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e808ac76-e289-4ad1-9849-0c06cc4ce66f {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1350.516892] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181728MB free_disk=174GB free_vcpus=48 pci_devices=None {{(pid=59518) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}}
[ 1350.517022] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1350.517195] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
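The burst of "Running periodic task ComputeManager._*" lines comes from oslo.service's periodic task runner, which iterates over every method the manager registered with the periodic_task decorator and logs each invocation. A minimal sketch of how such tasks are declared, assuming oslo.service and oslo.config are installed (the class is illustrative, not ComputeManager itself):

from oslo_config import cfg
from oslo_service import periodic_task

CONF = cfg.CONF

class ManagerSketch(periodic_task.PeriodicTasks):
    def __init__(self):
        super().__init__(CONF)

    # The runner calls this on its own cadence and emits the
    # "Running periodic task ..." line around each call.
    @periodic_task.periodic_task(spacing=60)
    def _poll_volume_usage(self, context):
        pass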
[ 1350.552777] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Instance 8b692644-9080-4ffd-89b3-c8cd64de0e4f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59518) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 1350.552974] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Total usable vcpus: 48, total allocated vcpus: 1 {{(pid=59518) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}}
[ 1350.553112] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=640MB phys_disk=200GB used_disk=1GB total_vcpus=48 used_vcpus=1 pci_stats=[] {{(pid=59518) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}}
[ 1350.576677] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b3e637d-f466-4a79-a366-67c69b72eefa {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1350.583348] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f679f720-4823-40f2-a6b4-47f8d60c8be2 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1350.613090] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aa348f01-d72b-4c01-801f-380d796ded75 {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1350.619565] env[59518]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ae2a59a2-902b-4fb2-94ec-980d35ff63aa {{(pid=59518) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1350.631882] env[59518]: DEBUG nova.compute.provider_tree [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Inventory has not changed in ProviderTree for provider: ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd {{(pid=59518) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1350.640069] env[59518]: DEBUG nova.scheduler.client.report [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Inventory has not changed for provider ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 174, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59518) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1350.654381] env[59518]: DEBUG nova.compute.resource_tracker [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59518) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}}
[ 1350.654544] env[59518]: DEBUG oslo_concurrency.lockutils [None req-c7745552-3b42-4690-a2c6-a3eac2156aa5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.137s {{(pid=59518) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1360.119137] env[59518]: DEBUG nova.compute.manager [req-d0496a9c-5b2c-4b84-b3f2-0c0a60b216ca req-f69f54b4-4d14-4c1f-9aa7-5a675277527e service nova] [instance: 8b692644-9080-4ffd-89b3-c8cd64de0e4f] Received event network-vif-deleted-31e590b2-078f-4f45-a603-0ba5a409b415 {{(pid=59518) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}}
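For reference, the inventory payload in the set_inventory_for_provider entry has a fixed per-resource-class shape, and placement derives schedulable capacity from it as (total - reserved) * allocation_ratio. A small worked example using the exact values logged above:

# Inventory as logged by nova.scheduler.client.report for provider
# ad495ffb-c2e4-4a1d-9e2f-9d3d73327fcd.
inventory = {
    'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16,
             'step_size': 1, 'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1,
                  'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0},
    'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 174,
                'step_size': 1, 'allocation_ratio': 1.0},
}

def capacity(resource_class):
    # Placement's effective-capacity formula.
    inv = inventory[resource_class]
    return (inv['total'] - inv['reserved']) * inv['allocation_ratio']

print(capacity('VCPU'))       # 192.0 schedulable vCPUs (48 physical, ratio 4.0)
print(capacity('MEMORY_MB'))  # 196078.0 MB after the 512 MB reservation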